Data Architect
GroupM Data & Technology
London

At the heart of the world’s leading media agency network is a future-facing product company, building the tools to make media work for everyone. In partnership with the globe’s leading clients, agency teams, media companies and technology platforms, we’re using our privileged position to help our customers ascend to vantage points unique in our industry.

Our teams bring together agile product management, cutting edge data science and enterprise scale engineering to build products that will shape the next decade of data driven marketing. We believe consumer privacy, client confidentiality, brand growth and user experience are essential to performance and the sustainability of the advertising ecosystem and have assembled a global team with diverse skills and experience to help shape that future.

Key Responsibilities:

  • You will be accountable for the technical design of our data pipelines and must ensure that our end-to-end architecture both scales and evolves in line with the use cases it supports.
  • You will partner with the business, working closely with product leadership to ensure our systems are fit for purpose through a nuanced understanding of the business context.
  • Defining data architecture best practice in terms of processing data at scale and testing data systems, nuanced to our business context
  • Ensuring that the technical delivery of our data systems is fit for purpose
  • Providing designs and solutions for Proteus (our in-house data system) that allow the development teams to deliver software on time and to agreed functional and non-functional specifications
  • Providing technical leadership in order to make sure that the designs are implemented using appropriate tools and techniques
  • Managing technical debt across projects, making the right calls when balancing pragmatic delivery against compromises to implementation patterns
  • Helping safeguard our SRE practice and disseminating key processes
  • Reporting any architectural, implementation and infrastructure issues that prevent the teams from delivering software, and advising on changes needed to remove their root causes
  • Performing hands-on development where needed, during build phases of critical projects
  • Experimenting and choosing new technologies to fit into our ecosystem - be an expert authority on data architecture & engineering tools, techniques and patterns
  • Attending various meetings, technical reviews and delivery activities
  • Liaising with senior business stakeholders throughout the software development life cycle (SDLC)
  • Leading technical interviews
  • You will be hands-on and expected to join the teams when needed. Lead by example.

Desirable Experience:

You have a proven track record of:

  • Overseeing complex technical delivery of a team of architects/tech leads - ideally a mix of both onshore and offshore resources
  • Operating in the big data stack (Spark, Scala, Kafka, Delta Lake [preferred])
  • Designing and managing the successful delivery of cloud-native data pipeline applications at scale
  • Working across the full SDLC on a major project
  • Discerning the nuances of the business context that should influence system design, and applying the right solution to the problem
  • Designing a shared-service used by multiple consumers
  • Making confident decisions around interface design and whether functionality should sit inside or outside a service's boundary
  • Breaking down a large production application into smaller services

Desirable Skills:

You are:

  • Proficient in Java
  • Proficient in SQL
  • Experienced across AWS (EMR) and GCP (Dataproc, BigQuery)
  • Experienced in the development workflow
  • Highly detail-oriented
  • Focussed on delivery
  • Experienced in the ELT paradigm, having worked with a range of tools (code implementations and frameworks alike) from data ingestion and processing to data modeling and visualization

You know:

  • Modern big data architecture
  • Microservice architecture patterns
  • API implementation patterns and interfaces
  • Big data file formats (Avro, ORC, Parquet) and when to use each
Ideally, you:

  • Have extensive experience within the advertising industry and associated 3rd-party datasets

About GroupM
GroupM is the world’s leading media investment company responsible for more than $113B in annual media investment through agencies including Mindshare, MediaCom, Wavemaker, Essence and m/SIX, as well as the outcomes-driven programmatic audience company, Xaxis. GroupM creates competitive advantage for advertisers via its worldwide organization of media experts who deliver powerful insights on consumers and media platforms, trading expertise, market-leading brand-safe media, technology solutions, addressable TV, content, sports and more.

Discover more about GroupM at www.groupm.com
Follow GroupM on LinkedIn - https://www.linkedin.com/company/groupm

GroupM embraces and celebrates diversity, inclusivity, and equal opportunity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. We are a worldwide media agency network that represents global clients. The more inclusive we are, the more great work we can create together.