Job Description
Help manage a high-performance team of Data Engineers. Responsibilities include:
• Contribute to and help lead the team in designing, building, testing, scaling, and maintaining data pipelines from a variety of source systems and streams (internal, third party, cloud based, etc.), according to business and technical requirements.
• Deliver observable, reliable, and secure software, embracing a "you build it, you run it" mentality, with a focus on automation and GitOps.
• Continually work on improving the codebase, with active participation and oversight in all aspects of the team, including agile ceremonies.
• Take an active role in story definition, assisting business stakeholders with acceptance criteria.
• Work with Principal Engineers and Architects to share and contribute to the broader technical vision.
• Develop and champion best practices, striving toward excellence and raising the bar within the department.
• Develop solutions combining data blending, profiling, mining, statistical analysis, and machine learning to better define and curate models, test hypotheses, and deliver key insights.
• Operationalize data processing systems (DevOps).
Fully Remote: This position has been designated as fully remote, meaning that the position is expected to contribute from a non-NBCUniversal worksite, most commonly an employee's residence.
Qualifications
Bachelor's degree in Computer Science, Computer Engineering, Data Engineering, Physics, or a related field (or foreign degree equivalent).
The position requires five (5) years of experience in the job offered, as a Software Engineer, or in a related occupation.
The position requires each of the following skills, which must have been gained through five (5) years of experience:
• Programming skills in one or more of the following: Python, Java, Scala, R, SQL;
• Writing reusable/efficient code to automate analysis and data processes;
• Developing data catalogs and enforcing data cleanliness to ensure clarity and correctness of key business metrics.
The position requires the following skill, which must have been gained through four (4) years of experience:
• Building and maintaining dimensional data warehouses in support of BI tools.
The position requires each of the following skills, which must have been gained through three (3) years of experience:
• Near-real-time and batch data pipeline development for big data;
• Processing structured and unstructured data into a form suitable for analysis and reporting, including integration with a variety of data metric providers (advertising, web analytics, and consumer devices);
• Hands-on programming with the following (or similar) technologies: Apache Beam, Scio, Apache Spark, and Snowflake;
• Progressive data application development, working in large-scale/distributed SQL, NoSQL, and/or Hadoop environments.
The position requires the following skill, which must have been gained through two (2) years of experience:
• Building streaming data pipelines using Kafka, Spark, or Flink.
The position requires the following skill, which must have been gained through one (1) year of experience:
• Implementing scalable, distributed, and highly available systems using Google Cloud.
This position is eligible for company sponsored benefits, including medical, dental and vision insurance, 401(k), paid leave, tuition reimbursement, and a variety of other discounts and perks. Learn more about the benefits offered by NBCUniversal by visiting the Benefits page of the Careers website.
Salary range: $175,947–$176,000 per year
Full-time: 40 hours/week
For LA County and City Residents Only: NBCUniversal will consider for employment qualified applicants with criminal histories, or arrest or conviction records, in a manner consistent with relevant legal requirements, including the City of Los Angeles' Fair Chance Initiative for Hiring Ordinance, the Los Angeles County Fair Chance Ordinance for Employers, and the California Fair Chance Act, where applicable.
Jobcode: Reference SBJ-vewe67-3-17-154-155-42 in your application.