Software Developer – Ribbon Analytics – Westford, MA

Job Opportunity Details

Type

Full Time

Salary

Not disclosed

Work from home

No

Weekly Working Hours

Not disclosed

Positions

Not disclosed

Working Location

Westford, MA, United States

ABOUT RIBBON COMMUNICATIONS
Ribbon Communications is a company with two decades of leadership in real-time communications. Built on world-class technology and intellectual property, the company delivers intelligent, secure, embedded real-time communications for today’s world.  The company transforms fixed, mobile and enterprise networks from legacy environments to secure IP and cloud-based architectures, enabling highly productive communications for consumers and businesses.  With 64 locations in 27 countries around the globe, Ribbon’s innovative, market-leading portfolio empowers service providers and enterprises with rapid service creation in a fully virtualized environment.  To learn more, visit rbbn.com.

OPPORTUNITY

Ribbon Communications is looking for a software developer to assist in the design and development of features on Ribbon Analytics. Ribbon Analytics is a big data Network Analytics and Security product that collects, processes, and reacts to massive amounts of data collected from the network, leveraging Machine Learning and other techniques to analyze trends and outliers in the data and take action to mitigate security threats, fraud, and other issues in a customer’s network.

The position will be within the Ribbon Technology and Solutions development team, working on the latest technologies in the Big Data and Analytics field: contemporary data visualization and UI frameworks such as Angular as a front end to Big Data platform engines such as Hadoop, deployed on Kubernetes/Docker within a virtualized, micro-services application architecture.

A successful candidate must be self-driven and possess a strong work ethic, with a career interest in software development of highly scalable applications. They must be excited about working with new technologies and comfortable in a dynamic work environment.

What you will be doing: (Responsibilities)

  • Build and optimize real-time and batch data processing systems, ensuring high availability, fault tolerance, and scalability.
  • Architect and develop large-scale, distributed data processing pipelines using technologies like Apache Trino/Impala, Flink, and Airflow.
  • Design and implement efficient data ingestion, transformation, and storage solutions for structured and unstructured data.
  • Partner closely with Engineering Leaders, Architects, and Product Managers to understand business requirements and provide technical solutions within a larger roadmap.
  • Collaborate with data engineers, analysts, and scientists to understand business requirements and translate them into technical solutions.
  • Implement best practices for data governance, data quality, and data security across the entire data lifecycle.
  • Mentor and guide junior engineers, fostering a culture of continuous learning and knowledge sharing.
  • Stay up-to-date with the latest trends, technologies, and industry best practices in the big data and data engineering domains.
  • Participate in code reviews, design discussions, and technical decision-making processes.
  • Contribute to the development and maintenance of CI/CD pipelines, ensuring efficient and reliable deployments.
  • Collaborate with cross-functional teams to ensure the successful delivery of projects and initiatives.

What we need to see: (Qualifications)

  • 8+ years of experience in software engineering
  • 5+ years working with data engineering technologies.
  • Degree in Computer Science, Electrical Engineering, Computer Engineering, or a related field, ideally with specialization in Data Engineering or Machine Learning
  • Broad expertise and experience with distributed systems, streaming systems, and data engineering tools, such as SQL (Trino/Impala), HDFS, S3, Kubernetes, Airflow, Kafka, Flink, etc.
  • Experienced in engineering data pipelines using big data technologies (Impala, Presto, Spark, Flink) on medium to large scale data sets
  • Deep knowledge of Python, advanced SQL, and database technologies.
  • Experience in Java

Ways To Stand Out From The Crowd (Preferred Skills)

  • Experience developing microservice architectures (Kubernetes, containers, REST APIs)
  • Experience developing data analytics solutions in AWS
  • Experience with Deep Learning platforms
  • Experience mentoring and leading junior technical staff

Please Note:

All qualified applicants will receive consideration for employment without regard to race, age, sex, color, religion, sexual orientation, gender identity, national origin, protected veteran status, disability, or any other characteristic protected by applicable law.

US Citizens and all other parties authorized to work in the US are encouraged to apply.

