Machine Learning Solutions Architect

Posted 2024-11-24

💎 Seniority level: Senior (at least 6 years)

📍 Location: United States, Latin America, India

🔍 Industry: Data and Machine Learning Solutions

🗣️ Languages: English

⏳ Experience: At least 6 years

🪄 Skills: AWS, Docker, Leadership, Python, Software Development, SQL, Django, Flask, GCP, Java, Kafka, Keras, Kubernetes, Machine Learning, MLFlow, MySQL, Oracle, QA, SAP, Snowflake, Spring, Azure, Data science, RDBMS, Spark, Tensorflow, Linux, Presentation skills, Documentation

Requirements:
  • At least 6 years of experience as a Machine Learning Engineer, Software Engineer, or Data Engineer.
  • 4-year Bachelor's degree in Computer Science or a related field.
  • Experience deploying machine learning models in a production setting.
  • Expertise in Python, Scala, Java, or another modern programming language.
  • Strong working knowledge of SQL and the ability to write, debug, and optimize distributed SQL queries.
  • Hands-on experience with big data ecosystem products and languages such as Spark, Snowflake, and Databricks.
  • Familiarity with multiple data sources, including JMS, Kafka, RDBMS, DWH, MySQL, Oracle, and SAP.
  • Systems-level knowledge in network/cloud architecture, operating systems, and storage systems.
  • Production experience in core data technologies such as Spark, HDFS, and Snowflake.
  • Experience in developing APIs and web server applications.
  • Complete software development lifecycle experience including design, documentation, implementation, testing, and deployment.
  • Excellent communication and presentation skills.

Responsibilities:
  • Designing and implementing data solutions best suited to deliver on customer needs.
  • Providing thought leadership by recommending technologies and solutions.
  • Creating environments for data scientists to build models.
  • Extracting data from customer systems and placing it in analytical environments.
  • Defining deployment approaches and infrastructure for models.
  • Demonstrating the business value of data by manipulating and transforming it into actionable insights.
  • Ensuring model deployment aligns with business systems and can be maintained.
  • Creating operational testing strategies and validating models prior to deployment.