10.09.2024

DevOps / DataOps Engineer

We are seeking an experienced and dedicated DevOps / DataOps Engineer to lead, design, and maintain the core IT cloud infrastructure of our forward-thinking organization.

As a key player in our team, you'll have the opportunity to shape and optimize our cloud environments, security protocols, and CI/CD pipelines, ensuring smooth and secure operations. You'll work with the latest cloud technologies and tools to build scalable solutions that drive business success.

Duties and Responsibilities:

  • Architect, develop, and maintain cloud infrastructure to support IT, Data, and R&D projects related to data-driven decision-making.

  • Lead efforts in automating and streamlining operations and processes, leveraging CI/CD to ensure rapid, consistent delivery of services.

  • Lead the design and development of the back-end data infrastructure interacting with our fleet of Dexter robots.

  • Design and manage data pipelines using data integration platform technologies, ensuring efficient data flow and integration across platforms.

  • Collaborate with development teams to support seamless application deployment and monitoring.

  • Continuously improve and evolve our IT systems to stay ahead of industry trends and challenges.

  • Ensure robust cybersecurity and compliance practices (e.g. SOC 2) are in place, safeguarding our systems and data.

  • Stay current with industry trends, making recommendations for improvements and ensuring that the company leverages new technology effectively.

  • Update documentation, procedures, and work instructions.


Work Experience and Education requirements:

  • Bachelor's degree in Computer Science or equivalent, with 7 to 10 years of experience.

  • Proven experience in a similar position.


Knowledge, Skills & Abilities required:

  • Deep expertise in IT cloud infrastructure (Microsoft Azure), with a passion for implementing efficient and scalable solutions.

  • Extensive experience with CI/CD pipelines, automation, and modern DevOps tools such as Azure DevOps, Jenkins, Docker, Kubernetes, and Terraform.

  • Hands-on experience with data platform technologies (Azure Data Factory, Databricks, Snowflake, or equivalent) for creating, orchestrating, and managing data pipelines.

  • High proficiency in creating scripts in Python, PowerShell, or other languages to automate repetitive operations.

  • Proven track record in implementing data middleware using tools such as Boomi or Talend.

  • Strong background in cybersecurity, ensuring the highest levels of protection across the organization.

  • Hands-on experience with asset monitoring (e.g. telemetry and observability frameworks, IoT platforms).

  • Familiarity with cloud business applications (CRM, ERP), Google Workspace, and Box.

  • Excellent problem-solving skills and the ability to prioritize a diverse workload.

  • Flexible, agile mindset; service-oriented, customer-focused, and results-driven.

  • Eager to join a fast-growing company.

  • Fluency in English is mandatory; a good level of French is preferred, and German is a plus.

  • Effective written and oral communication skills.


Benefits:

  • Opportunity to work in a fast-growing company active in one of the most promising fields of medicine.

  • A dynamic atmosphere in an internationally minded environment.

  • Permanent contract with 25 days of paid vacation.

  • Work remotely one to two days per week for added flexibility.

  • Attractive talent development programs and initiatives that empower employees to enhance their skills and foster professional growth.

  • Mobility plan, competitive company pension plan, support for gym memberships.

  • Regular team-building events such as boot camps, skiing, and much more.

