Terraform Jobs

Find remote positions requiring Terraform skills. Browse opportunities where you can apply your expertise and grow your career.

Terraform
547 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ United States

πŸ’Έ 110000 USD per year

πŸ” Healthcare technology solutions

🏒 Company: Acentra Health

  • Bachelor’s degree with 6–8 years or Master’s degree with 4–6 years of relevant work experience.
  • Proficiency in JIRA, Confluence, and version control tools.
  • Extensive knowledge of Linux and AWS DevOps tools.
  • Familiarity with scripting languages and automation tools.
  • AWS certification and advanced Linux skills are preferred.

  • Functions as an individual contributor under minimal supervision.
  • Demonstrates expertise in AWS services and applies knowledge of Kubernetes.
  • Uses IaC tools for automated infrastructure provisioning (see the sketch after this list).
  • Collaborates to maintain DevOps processes and best practices.
  • Maintains CI/CD pipelines and performs code reviews.
  • Participates in process improvement initiatives.
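
As a concrete illustration of the IaC bullet above (not part of the posting): a minimal Python sketch that drives a non-interactive Terraform init/plan/apply cycle from a script. The `./infra` directory is a hypothetical location for the `*.tf` files.

```python
import subprocess

def provision(workdir: str) -> None:
    """Run a non-interactive Terraform init/plan/apply cycle."""
    base = ["terraform", f"-chdir={workdir}"]
    subprocess.run(base + ["init", "-input=false"], check=True)
    subprocess.run(base + ["plan", "-input=false", "-out=tfplan"], check=True)
    subprocess.run(base + ["apply", "-input=false", "tfplan"], check=True)

provision("./infra")  # hypothetical directory holding the *.tf files
```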

AWS, Docker, Python, Bash, Git, Jenkins, Kubernetes, Jira, Azure, Groovy, Maven, RDBMS, Collaboration, CI/CD, Linux, DevOps, Terraform, Microservices

Posted 2024-11-21
Apply

πŸ“ Germany, Austria

πŸ” Financial Technology

  • Excellent university degree in computer science, mathematics, natural sciences, or a similar field.
  • Knowledge of econometrics with an emphasis on portfolio optimization and risk modelling.
  • Experience with convex optimization and exposure to libraries like cvxpy, scipy, or cvxopt (see the sketch after this list).
  • Exposure to tech stack: Python, Docker, CI/CD, Infrastructure as Code (Terraform), SQL.
  • Experience with cloud providers like AWS.
  • Knowledge of software development/design in Python and relational databases/SQL.
  • Fluent in English with strong communication skills.
  • Proactive and independent working style.
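
To ground the convex-optimization bullet above (illustrative only, with made-up toy data): a minimal cvxpy sketch of a long-only mean-variance portfolio problem of the kind the role describes.

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n = 5                                         # toy universe of 5 assets
mu = rng.normal(0.05, 0.02, n)                # made-up expected returns
Sigma = np.diag(rng.uniform(0.01, 0.05, n))   # made-up diagonal covariance

w = cp.Variable(n)                            # portfolio weights
gamma = 1.0                                   # risk-aversion parameter
objective = cp.Maximize(mu @ w - gamma * cp.quad_form(w, Sigma))
constraints = [cp.sum(w) == 1, w >= 0]        # fully invested, long-only
cp.Problem(objective, constraints).solve()
print(w.value)
```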

  • Develop and fully automate trading algorithms focusing on risk, cost, and tax optimization.
  • Productionize algorithmic models that respond to client requests and market movements.
  • Create high-performance risk management tools and interactive dashboards.
  • Architect interfaces connecting services and generate data-driven business value.
  • Completely automate algorithms that manage billions in assets.
  • Productionize econometric models for portfolio management.

Docker, Python, Software Development, Agile, Machine Learning, Algorithms, Communication Skills, CI/CD, Terraform, Time Management

Posted 2024-11-21
Apply

πŸ“ Mexico, Gibraltar, Colombia, USA, Brazil, Argentina

🧭 Full-Time

πŸ” Cryptocurrency

🏒 Company: Bitso

  • 4+ years of professional experience working with analytics, ETLs, and data systems as an individual contributor.
  • 3+ years of experience in engineering management at tech companies.
  • Expertise in defining and implementing data architectures, including ETL/ELT pipelines, data lakes, data warehouses, and real-time data processing systems.
  • Expertise with cloud platforms (AWS preferred), data engineering tools (Databricks, Spark, Kafka), and SQL/NoSQL databases.
  • Expertise translating business requirements into technical solutions and data architecture.
  • Expertise with orchestration tools (e.g., AWS Step Functions, Databricks Workflows, or Dagster).
  • Proven experience in building data migration services or implementing change data capture (CDC) processes (see the sketch after this list).
  • Experience with CI/CD tools (GitHub Actions).
  • Experience with CDP platforms and handling behavioral data (e.g., Segment, Amplitude, Avo).
  • Experience with infrastructure-as-code technologies (e.g., Terraform) and serverless for data engineering tasks.
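
To illustrate the CDC bullet above (a naive sketch, not Bitso's actual approach): the simplest form of change data capture polls a source table using a high-watermark column. The `orders` table and its columns are hypothetical.

```python
import sqlite3  # stand-in for any relational source

def pull_changes(conn: sqlite3.Connection, last_seen: str) -> list[tuple]:
    """Naive CDC: poll for rows whose updated_at watermark moved forward."""
    cur = conn.execute(
        "SELECT id, payload, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_seen,),
    )
    return cur.fetchall()  # caller persists max(updated_at) as the new watermark
```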

  • Lead the Data Engineering team and the Data Governance lead in daily tasks, providing technical expertise and mentoring.
  • Prioritize workload, set clear goals and drive accountability to ensure the team delivers exceptional data products in a timely manner.
  • Mentor and coach the entire Data Engineering division, fostering professional development and a culture of innovation.
  • Partner with Data Science divisions to drive data products that solve business problems.
  • Engage with stakeholders to define roadmaps according to Bitso’s priorities.
  • Recruit and retain top talent.
  • Define and drive Bitso’s data strategy in partnership with the SVP of Data Science.

AWS, Leadership, SQL, Business Intelligence, ETL, Kafka, Strategy, Data Engineering, Data Science, Serverless, NoSQL, Spark, Collaboration, CI/CD, Mentoring, DevOps, Terraform

Posted 2024-11-21
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 95300 - 113600 USD per year

πŸ” Cloud networking

  • 4+ years in a technical account management position.
  • Strong background in technical cloud consulting and/or customer support preferably with Cloud Service Providers and/or SaaS vendors.
  • Deep understanding of cloud networking and security.
  • Proficiency in network configuration, performance optimization, and health analysis (see the sketch after this list).
  • Knowledge of network automation tools like Terraform.
  • Demonstrated ability to lead change management processes.
  • Excellent communication skills and strong relationship-building skills.
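
To make the health-analysis bullet above concrete (an illustrative sketch, not an Aviatrix tool): a tiny Python probe that measures TCP connect latency as a rough reachability signal. The host and port are placeholders.

```python
import socket
import time

def tcp_latency_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Measure TCP connect time as a rough reachability/latency signal."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        return (time.perf_counter() - start) * 1000.0

print(f"{tcp_latency_ms('example.com'):.1f} ms")
```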

  • Provide best practice guidance for architecture stability.
  • Conduct performance analysis and present evaluations to customers.
  • Champion customer requirements within Aviatrix.
  • Serve as the primary contact for clients, providing strategic guidance.
  • Guide proactive and reactive support for customer platform health.

Strategy, Communication Skills, Collaboration, Negotiation, Terraform, Documentation

Posted 2024-11-21
Apply

πŸ“ UK, India, Germany

πŸ” Fintech

🏒 Company: Careers at Tide

  • Have some experience building server-side applications.
  • Sound knowledge of a backend framework, such as Spring/Spring Boot for microservices.
  • Experience in engineering scalable solutions in cloud-native environments.
  • Understanding of CI/CD and practical Agile concepts.
  • Demonstrate a mindset of delivering secure, well-tested, and well-documented software.

  • Contribute to our event-driven Microservice Architecture (currently 200+ services owned by 40+ teams).
  • Define and maintain the services your team owns, from design to global scaling.
  • Use Java 17, Spring Boot, and JOOQ for building services.
  • Expose and consume RESTful APIs, treating them as products.
  • Use SNS, SQS, and Kafka for event handling (see the sketch after this list).
  • Utilize PostgreSQL via Aurora for data storage.
  • Deploy services to Production as needed, utilizing CI/CD pipelines with GitHub.
  • Practice modern GitOps using ArgoCD and Docker.
  • Monitor services with DataDog.
  • Collaborate with Product Owners on user needs and requirements.
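
The event-handling bullet above, sketched for illustration: the role itself uses Java 17 and Spring Boot, but the long-poll, handle, and delete pattern for SQS looks the same in any language; here it is in Python with boto3, with a hypothetical queue URL and an assumed region.

```python
import boto3

sqs = boto3.client("sqs", region_name="eu-west-1")  # region is an assumption
QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/example-events"  # hypothetical

def drain_once() -> None:
    """Long-poll the queue, handle each message, then delete it."""
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
    )
    for msg in resp.get("Messages", []):
        print(msg["Body"])  # a real handler would dispatch on event type
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```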

AWS, Docker, PostgreSQL, Agile, JUnit, Kafka, Kubernetes, Spring, Spring Boot, CI/CD, RESTful APIs, Terraform, Microservices

Posted 2024-11-21
Apply

πŸ“ Canada

πŸ” Insurance

Requirements: not stated.

  • Be responsible for building infrastructure needed for PolicyMe's operations.
  • Support the existing infrastructure and enhance it as required.
  • Contribute to the continued growth and development of the company.

AWS, Docker, Bash, Cloud Computing, Cybersecurity, Git, Kubernetes, Nginx, *Nix, Amazon Web Services, REST API, CI/CD, RESTful APIs, Linux, DevOps, Terraform

Posted 2024-11-21
Apply

πŸ“ Belgium, Spain

πŸ” Hospitality industry

🏒 Company: Lighthouse

  • 5+ years of professional experience using Python, Java, or Scala for data processing (Python preferred)
  • Experience with writing data processing pipelines and with cloud platforms like AWS, GCP, or Azure
  • Experience with data pipeline orchestration tools like Apache Airflow (preferred), Dagster, or Prefect (see the sketch after this list).
  • Deep understanding of data warehousing strategies
  • Experience with transformation tools like dbt to manage data transformation in your data pipelines
  • Some experience in managing infrastructure with IaC tools like Terraform
  • Stay updated with industry trends, emerging technologies, and best practices in data engineering
  • Improve, manage, and teach standards for code maintainability and performance in code submitted and reviewed
  • Ship large features independently, generate architecture recommendations with the ability to implement them
  • Strong communicator who can describe complex topics simply to a variety of technical and non-technical stakeholders.
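
To illustrate the orchestration bullet above (a minimal sketch assuming Airflow 2.4+; the DAG name, scripts, and dbt selector are hypothetical): a daily extract step feeding a dbt transformation.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_rates_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="python extract.py")
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --select rates")
    extract >> transform             # transformations run only after ingestion
```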

  • Design and develop scalable, reliable data pipelines using the Google Cloud stack.
  • Ingest, process, and store structured and unstructured data from various sources into our data-lakes and data warehouses.
  • Optimise data pipelines for cost, performance and scalability.
  • Implement and maintain data governance frameworks, ensuring data accuracy, consistency, and compliance.
  • Monitor and troubleshoot data pipeline issues, implementing proactive measures for reliability and performance.
  • Mentor and provide technical guidance to other engineers working with data.
  • Partner with Product, Engineering & Data Science teams to operationalise new solutions.

Python, Apache Airflow, GCP, Java, Kafka, Kubernetes, Data Engineering, Grafana, Prometheus, Spark, CI/CD, Terraform, Documentation, Compliance

Posted 2024-11-21
Apply

πŸ“ United States

πŸ” Data and technology

  • 5+ years experience in security engineering or site reliability engineering.
  • Excellent Terraform skills required.
  • Experience working with and developing CI/CD pipelines for Infrastructure as Code required.
  • Knowledge of programming/scripting fundamentals (python/golang) required.
  • Expertise in performing ETL onboarding for diverse log feed technologies required.
  • Experience supporting Splunk platform administration, new content dashboards, applications, and use cases.
  • Hands-on experience developing REST APIs to capture data from external sources (see the sketch after this list).
  • Experience with Agile methodologies.
  • Understanding of multiple log formats and source data for SIEM Analysis.
  • Solid background with Windows and Linux platforms (security or system administration).
  • Understanding of technical concepts including networking and common cyber attack techniques.
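
As an illustration of the REST-ingestion requirement above (a minimal sketch; the endpoint, token, and `since` parameter are hypothetical): pulling events from an external feed before parsing them into the SIEM.

```python
import requests

API_URL = "https://logs.example.com/api/v1/events"  # hypothetical feed endpoint

def fetch_events(token: str, since: str) -> list[dict]:
    """Pull events from an external REST feed for SIEM onboarding."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        params={"since": since},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```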

  • Understand data feeds of multiple security tools and logs that feed the SIEM & UEBA technologies.
  • Identify capabilities and quality of these feeds and recommend improvements.
  • Create new content use cases based on threat intelligence, analyst feedback, available log data, and previous incidents.
  • Perform daily activities of the content life cycle including creating, testing, tuning, and maintaining associated documentation.
  • Remediate vulnerabilities across different application environments.
  • Work with other security teams and product SMEs to identify capability gaps.
  • Develop parsers and field extractions to support content development.
  • Develop custom scripts to enhance default SIEM functionality.
  • Participate in root cause analysis on security incidents and provide recommendations for new data sources and enrichment.

Python, Agile, ETL, Golang, REST API, CI/CD, Linux, Terraform, Documentation

Posted 2024-11-21
Apply

πŸ“ India

πŸ” Fare payment and public transport

  • Strong core Java/Kotlin and object-oriented programming skills.
  • Experience in implementing high-quality, high-performance software.
  • Strong native Android development experience.
  • Understanding of architectural patterns used to build highly available services.
  • Interest in working with Linux-based IoT devices.
  • Collaborative mindset and teamwork skills.
  • Adaptable approach for different types of software environments.

  • Contribute to maintaining existing devices.
  • Scale the Validator network.
  • Ensure high performance and reliability for ticket validation.
  • Monitor and manage devices in the field.
  • Work on products that are highly visible to customers.

AWS, Docker, Android, DynamoDB, IoT, Java, Kotlin, Serverless, Linux, Terraform

Posted 2024-11-21
Apply
🔥 Data Engineer

πŸ“ Argentina, Spain, England, United Kingdom, Lisbon, Portugal

🧭 Full-Time

πŸ” Web3

🏒 Company: Reown

  • 5+ years working in the analytics stack within a fast-paced environment.
  • 3+ years of production experience with SQL templating engines like dbt.
  • Experience with distributed query engines (BigQuery, Athena, Spark), data warehouses, and BI tools.
  • Strong understanding of software engineering principles, coding standards, design patterns, version control (e.g., Git), testing methodologies, and CI/CD processes.
  • Experience with AWS/GCP/Azure services for deployment and management.
  • Familiarity with GitHub, CI/CD pipelines, GitHub Actions, and Terraform.
  • Ability to write Python scripts for ETL processes and data manipulation.
  • Proficient with libraries like pandas for analysis and transformation.
  • Experience handling various data formats (e.g., CSV, JSON, Parquet).
  • Strong problem-solving skills and communication abilities to discuss technical concepts.

  • Write complex SQL queries that extract and combine data from on-chain and off-chain logs for analytics.
  • Create dashboards and tools for team data discoverability and KR tracking.
  • Perform deep-dive analyses into specific topics for internal stakeholders.
  • Help design, implement, and evolve Reown's on-chain data infrastructure.
  • Build, maintain, and monitor end-to-end data pipelines for new datasets and features.
  • Write health checks and alerts to ensure data correctness, consistency, and freshness (see the sketch after this list).
  • Meet with product managers and stakeholders to understand data needs and detect new product opportunities.

AWS, Python, SQL, Data Analysis, Design Patterns, ETL, GCP, Git, Tableau, Azure, ClickHouse, Pandas, Spark, Communication Skills, CI/CD, Terraform, Written Communication

Posted 2024-11-20
Apply
Showing 10 of 547 jobs.