Opport2unity

Opport2unity is looking for

Data Scientist / Machine Learning Engineer

Job Description
Your Responsibilities

• As a data scientist, you will translate business needs into technical solutions and develop advanced machine learning models for the analysis of large-scale, high-dimensional data. You will have a direct impact on new and exciting AI solutions that will change products and processes.
• You will design, develop, and implement new, high-performance data science services together with your data science colleagues, which will help our colleagues improve the experience of our customers.
• Generate ideas for innovative AI-driven products, digital services, and business models, and determine their feasibility and added value in mixed teams with business and engineering experts.
• You are responsible for implementing our MLOps approach and contribute your experience with software delivery processes.
• Work on innovative projects with our partners on topics such as continuous improvement and validation of AI models for e-mobility and automated driving.
• You will stay current with the latest research and technology trends to build state-of-the-art machine learning and deep learning models. You will review code written by others and provide critical feedback. You will be responsible for the modelling process from raw data to a deployment-ready model for production.
• Combine the strengths of information retrieval technologies with the power of modern machine learning models.
• Turn business requirements into working products – design, map, code, and refine your solutions.
• Support (embedded) SW developers in designing, building, deploying, testing, validating, and maintaining AI algorithms & ML models.
• Program in Java and Python, using a wide range of technologies and libraries such as Lucene, Elasticsearch, Maven, pandas, scikit-learn, spaCy, and PySpark.
• Help improve existing products, analyze performance, and spot bugs and weaknesses.
• Deploy and maintain your services and machine learning models (MLOps).
• Coordinate work with other teams to make things happen (quality assurance, product management).
• You develop machine learning solutions and translate them into microservices, while ensuring a high quality of service, including the corresponding monitoring.
• An understanding of container technologies (such as Docker) and microservice architectures in the context of Big Data, as well as experience building CI/CD pipelines and Infrastructure as Code processes for cloud platforms (AWS or Google Cloud), is desirable.
• You build data flows with relational and non-relational data using appropriate technologies in a hybrid cloud environment.

Must have:
• B.Sc., M.Sc. or Ph.D. in computer science or a related field.
• 4+ years of experience working as a Machine Learning Engineer, Data Engineer, or in a similar role with a strong ML background.
• Experience working with relational databases (especially SQL).
• Enthusiasm for the development and practical implementation of machine learning models in a production environment.
• Experience building, deploying, and maintaining ML models in production.
• Experience with predictive modelling, machine learning and AI.
• Strong skills in programming languages such as SQL, Python, R, and Scala.
• Working knowledge of statistics, algebra and math.
• Experience with ML frameworks, Python tools for data science, and MLOps tools such as PyTorch, TensorFlow, ModelDB, MLflow, and Kubeflow.
• Solid skills in programming with Python and ability to write efficient, well-tested code with a keen eye on maintainability.
• Experience with cloud-based data pipelines optimized for machine learning applications; GCP is a plus.
• Proficient with Docker, Kubernetes, and Infrastructure as Code (e.g., Terraform).
• Excellent communication and teamwork skills.
• Familiar with workflow management platforms like Airflow or Argo Workflows.
• Familiar with best practices in the data engineering and MLOps community.
• You enjoy working in an agile team, like to get involved, think outside the box, and are keen to tackle new topics.
• Knowledge of Google Cloud Platform, Terraform, Kubernetes, Docker, BigQuery, Apache Beam, Dataflow, Python, SQL, REST APIs, and GraphQL.
• Basic knowledge of Spark, Hadoop, and other cloud solutions. Knowledge of data modelling, DevOps, and agile software development. A plus: Cloudera toolchain, Tableau, Spotfire, and signal processing and data analysis of sensor data such as inertial or pressure sensors.
• Excellent English skills are required; German skills are preferred.

How to apply:
Send your CV to this email: ›

City: Berlin, Germany
Name / Company: Opport2unity
Email: ›
Tel / Fax: +49 (0) 30 38 307700
Address: Turmstr. 21, Haus K, Eingang A, 10559 Berlin, Germany
Website: https://www.opport2unity.com/