We are looking for a talented and passionate Data Engineer who can operate in a changing environment and actively seeks out ways to add value to the organisation.
The Data Engineer assists the data scientists and the Head of Analytics with data extraction, ingestion, partitioning, transformation, loading and querying for analytics purposes, supporting the delivery of actionable insights on top of a high-performing big data ecosystem.
· Lead on the ingestion, understanding and management of raw and aggregated digital and TV data, with a major focus on audiences (e.g. subscribers, users, viewers), content (e.g. genre, programme) and technology usage (e.g. apps, VOD, web, social media).
· Assist in the implementation of valuable metrics and data analytics techniques (e.g. digital attribution, time series, descriptive modelling, lifetime value, churn propensity, behavioural segments) on the Amazon Web Services big data platform to boost TV and overall digital business performance.
· Work on large raw databases and aggregated individual-level audience/subscriber datasets from linear TV, digital, VOD, SVOD and app platforms.
· Lead on data cleansing, health checks and validation of Redshift tables (DWH) for reporting/analytics purposes using SQL Workbench, Python, MapReduce and Lambda.
· Support the data scientists and Head of Analytics with data engineering tasks and data processing for root-cause analysis of changes in audience or subscriber behavioural patterns.
· Support the data scientists and Head of Analytics in building the right data warehouse tables and calculating the right KPIs for data products derived from:
o Return path data
o Mobile app data
o Web data (Google Analytics, Google BigQuery)
o Social media clickstream data
o TV/radio audience data
· Assist in restructuring, normalising, merging and processing data from both fast-transaction MBC Group data assets and external primary research data such as brand imagery trackers, segmentation, U&A, etc.
· Assist in developing applications, simulators and macros.
· Lead on running regular data quality checks.
· Act as the single source and point of contact for data-related matters.
· Work closely with TechOps and the big data architect on future implementations.
· 2–4 years of experience in a similar role.
· Holder of a bachelor's degree in Engineering, Actuarial Science, Computer Science, Information Technology or a related discipline from a reputable university.
· Solid experience with the digital data universe (e.g. web, Google, Adobe).
· At least 2 years of experience with a big data platform.
· Strong knowledge of, and experience using, big data engineering tools and languages (e.g. Python, Lambda, SQL Workbench, MapReduce, Spark).
· Understanding of the standard digital technologies used for commercial VOD, e-commerce and subscription services, including cookies, beaconing, tag management data, data layers or SDKs, and ad-serving technologies.
· Good knowledge of data visualisation software (Tableau, Business Objects, Qlik, Microsoft) is a plus.
· Fluency in English is a must.
· Deep understanding of structuring and querying data using Structured Query Language (SQL), Python, Lambda and MapReduce.
· Knowledge of blending multi-sourced datasets in the Amazon Web Services environment.
· Knowledge of Excel macros / VBA scripting.
· Basic or advanced skills in R, Spark and SPSS (Statistical Package for the Social Sciences) are a plus.
· Good knowledge of big data querying tools such as Hadoop, Pig, Hive and Impala is a plus.
· Ability to write ETL pipelines for new enterprise systems, databases, apps and macros using SQL Server Integration Services (SSIS).