Acquired by Lumen Technologies in 2019, Streamroot works to overcome one of the biggest challenges facing the internet today: the explosion of video traffic. Our goal is to redesign the way we deliver content online, to create more robust, cost-effective infrastructures, and to touch millions of internet users by bringing quality video to every corner of the world. Our content delivery technologies allow online content providers to improve performance for viewers by providing a real-time view of the conditions of each user device and thereby adapting video delivery to every viewer.
Founded in 2013, Streamroot pioneered device-side delivery technologies and quickly became a market leader for its mesh network and multi-CDN solutions. The company was backed by major venture funds and was acquired by the technology leader Lumen, which has placed our products at the center of an innovative portfolio of media delivery solutions alongside a global CDN and IP backbone.
Today, with Lumen’s global teams of nearly 45,000 employees, we are a division of 40 passionate engineers, business, and marketing professionals from 15 different nationalities, spread across offices in Paris, New York, and Denver. With an expanding customer base that includes media groups like Canal+, TF1, and France TV, we power billions of video sessions every year.
As a Data Developer and System Engineer, your mission is to build, maintain, and operate a robust, scalable data pipeline on public and private cloud. You are also responsible for both the quality and the operability of the data pipeline. You work closely with other backend engineers to keep our processes aligned and up to date across teams. As a Data team member, you help improve the efficiency of our product through occasional in-depth data investigations and R&D projects.
- Maintain and operate all components of our data pipeline in production on a day-to-day basis, on public and private cloud. You will be responsible for the scalability and reliability of the pipeline.
- Improve our deployment processes so that they keep pace with our growing scale
- Automate recurring analysis procedures and reporting to improve the data science team’s productivity and efficiency
- Maintain and develop the (internal and external) APIs used to access our data
- Maintain and develop the tools to collect, transform and store our data
- As a Data team member, contribute to efficiency improvements through occasional in-depth data investigations and R&D projects
- Guarantee the consistency and availability of the data stored across all our databases
- Build a long-term strategy for how the data pipeline should evolve to meet our new challenges
- Degree in Computer Science
- Good knowledge of at least one backend development language is required (Java, Python, Go, etc.)
- Experience with big data and DevOps frameworks/tools (Kafka, Spark, Flink, Docker/Kubernetes, ClickHouse, etc.) is required; excellence in at least one of these is highly desirable.
- Experience with infrastructure tooling (Terraform, Ansible)
- Excellent problem solving and analytical thinking skills
- Passion for quality and attention to detail
- Ability to thrive in a collaborative environment
- Good English and great communication skills
- Transparent and open communication with colleagues, managers, and direct reports; honesty, accountability, good listening, and teamwork skills
- A true international team with more than 10 different nationalities in our Paris office
- An office in the heart of Paris, in the 2ème arrondissement
- An unparalleled learning experience: we’ll give you the tools, train you, and coach you so you’ll be able to work independently. You’ll be given full responsibility for your projects.
- A ground-level opportunity in a growth environment
- Starting ASAP in Paris, FR.