ETL Job Salaries

Find salary information for remote positions requiring ETL skills. Make data-driven decisions about your career path.

ETL

Median high-range salary for jobs requiring ETL:

$180,090

This analysis is based on salary ranges collected from 162 job descriptions that match the search and allow working remotely. Choose a country to narrow down the search and view statistics exclusively for remote jobs available in that location.

The Median Salary Range is $140,000 - $180,090

  • 25% of job descriptions advertised a maximum salary above $213,000.
  • 5% of job descriptions advertised a maximum salary above $272,000.
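
The page does not publish its exact methodology, so here is a minimal sketch of how figures like these could be derived, assuming each posting contributes an advertised (minimum, maximum) salary: the median range pairs the median of the minimums with the median of the maximums, and the 25% / 5% figures are the 75th and 95th percentiles of the advertised maximums. The data in the sketch is hypothetical.

```python
# Hypothetical illustration: deriving a "Median Salary Range" and the
# 25% / 5% maximum-salary thresholds from advertised (min, max) ranges.
from statistics import median, quantiles

# One (minimum, maximum) pair per matching job description (made-up values).
advertised_ranges = [
    (120_000, 165_000),
    (140_000, 180_090),
    (150_000, 213_000),
    (165_000, 272_000),
]

minimums = [lo for lo, _ in advertised_ranges]
maximums = [hi for _, hi in advertised_ranges]

# Median range: median of the advertised minimums paired with the median
# of the advertised maximums.
median_range = (median(minimums), median(maximums))

# Percentile thresholds over the advertised maximums: 25% of postings sit
# above the 75th percentile, 5% above the 95th percentile.
pct = quantiles(maximums, n=100)   # pct[i] is the (i + 1)-th percentile
top_25_threshold = pct[74]
top_5_threshold = pct[94]

print(f"Median range: ${median_range[0]:,.0f} - ${median_range[1]:,.0f}")
print(f"25% of maximums above ${top_25_threshold:,.0f}")
print(f"5% of maximums above ${top_5_threshold:,.0f}")
```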

Skills and Salary

Specific skills can have a substantial impact on salary ranges for jobs that match this search. Among the postings analyzed, the skills that correlate most strongly with higher advertised salaries are Machine Learning, Data engineering, and Python. Employers tend to pay a premium for these skills because they map directly onto the core requirements of these roles, so demonstrating them can improve both earning potential and competitiveness in the job market. A short computation sketch after the list below illustrates one way the per-skill figures could be derived.

  1. Machine Learning

    26% of jobs mention Machine Learning as a required skill. The Median Salary Range for these jobs is $159,900 - $206,982.50.

    • 25% of job descriptions advertised a maximum salary above $235,000.
    • 5% of job descriptions advertised a maximum salary above $301,560.
  2. Data engineering

    48% of jobs mention Data engineering as a required skill. The Median Salary Range for these jobs is $150,000 - $198,550.

    • 25% of job descriptions advertised a maximum salary above $221,600.
    • 5% of job descriptions advertised a maximum salary above $273,000.
  3. Python

    67% of jobs mention Python as a required skill. The Median Salary Range for these jobs is $148,750 - $190,000.

    • 25% of job descriptions advertised a maximum salary above $216,500.
    • 5% of job descriptions advertised a maximum salary above $270,500.
  4. AWS

    40% of jobs mention AWS as a required skill. The Median Salary Range for these jobs is $150,000 - $190,000.

    • 25% of job descriptions advertised a maximum salary above $227,750.
    • 5% of job descriptions advertised a maximum salary above $287,260.
  5. SQL

    87% of jobs mention SQL as a required skill. The Median Salary Range for these jobs is $138,000 - $180,000.

    • 25% of job descriptions advertised a maximum salary above $210,000.
    • 5% of job descriptions advertised a maximum salary above $261,750.
  6. Data modeling

    52% of jobs mention Data modeling as a required skill. The Median Salary Range for these jobs is $130,147.50 - $180,000.

    • 25% of job descriptions advertised a maximum salary above $211,500.
    • 5% of job descriptions advertised a maximum salary above $256,677.
  7. Data visualization

    40% of jobs mention Data visualization as a required skill. The Median Salary Range for these jobs is $138,000 - $180,000.

    • 25% of job descriptions advertised a maximum salary above $207,750.
    • 5% of job descriptions advertised a maximum salary above $258,750.
  8. Data Analysis

    33% of jobs mention Data Analysis as a required skill. The Median Salary Range for these jobs is $125,000 - $176,880.

    • 25% of job descriptions advertised a maximum salary above $223,200.
    • 5% of job descriptions advertised a maximum salary above $267,750.
  9. Snowflake

    30% of jobs mention Snowflake as a required skill. The Median Salary Range for these jobs is $130,707 - $172,000.

    • 25% of job descriptions advertised a maximum salary above $210,000.
    • 5% of job descriptions advertised a maximum salary above $276,800.
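
As noted above, here is a minimal sketch of how the per-skill figures could be computed, assuming each posting carries a set of skill tags plus an advertised (minimum, maximum) salary; the records are hypothetical and the site's exact method may differ.

```python
# Hypothetical illustration: share of postings mentioning a skill, plus the
# median advertised range restricted to those postings.
from statistics import median

jobs = [
    {"skills": {"Python", "SQL", "AWS"}, "range": (150_000, 190_000)},
    {"skills": {"SQL", "Snowflake"}, "range": (130_707, 172_000)},
    {"skills": {"Python", "Machine Learning"}, "range": (159_900, 206_983)},
]

def skill_stats(skill):
    matching = [job for job in jobs if skill in job["skills"]]
    share = 100 * len(matching) / len(jobs)
    lows = [job["range"][0] for job in matching]
    highs = [job["range"][1] for job in matching]
    return share, median(lows), median(highs)

share, lo, hi = skill_stats("Python")
print(f"{share:.0f}% of jobs mention Python; median range ${lo:,.0f} - ${hi:,.0f}")
```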

Industries and Salary

Industry also plays a significant role in determining salary ranges for jobs that match this search, and some industries advertise markedly higher compensation than others. In the postings analyzed, the industries with the most competitive advertised salaries include Blockchain intelligence and financial services, Health tech, and Fintech. These sectors show strong demand for skilled professionals and are willing to invest in talent to meet their growth objectives; factors such as industry size, profitability, and market trends also influence salary levels, so industry is worth weighing alongside skills when evaluating career paths and salary expectations.

  1. Blockchain intelligence and financial services

    1% of jobs are in the Blockchain intelligence and financial services industry. The Median Salary Range for these jobs is $220,000 - $262,500.

    • 25% of job descriptions advertised a maximum salary above $270,000.
  2. Health tech

    2% of jobs are in the Health tech industry. The Median Salary Range for these jobs is $200,000 - $250,000.

    • 25% of job descriptions advertised a maximum salary above $325,000.
    • 5% of job descriptions advertised a maximum salary above $350,000.
  3. Fintech

    2% of jobs are in the Fintech industry. The Median Salary Range for these jobs is $177,000 - $227,500.

    • 25% of job descriptions advertised a maximum salary above $248,125.
    • 5% of job descriptions advertised a maximum salary above $255,000.
  4. Cybersecurity

    2% of jobs are in the Cybersecurity industry. The Median Salary Range for these jobs is $163,000 - $200,000.

    • 25% of job descriptions advertised a maximum salary above $205,250.
    • 5% of job descriptions advertised a maximum salary above $207,000.
  5. Software Development

    18% of jobs are in the Software Development industry. The Median Salary Range for these jobs is $144,144 - $190,000.

    • 25% of job descriptions advertised a maximum salary above $228,050.
    • 5% of job descriptions advertised a maximum salary above $250,250.
  6. Healthcare

    7% of jobs are in the Healthcare industry. The Median Salary Range for these jobs is $140,000 - $185,000.

    • 25% of job descriptions advertised a maximum salary above $202,500.
    • 5% of job descriptions advertised a maximum salary above $257,910.50.
  7. Mental Health

    2% of jobs are in the Mental Health industry. The Median Salary Range for these jobs is $126,650 - $179,000.

    • 25% of job descriptions advertised a maximum salary above $192,575.
    • 5% of job descriptions advertised a maximum salary above $197,100.
  8. Data Engineering

    3% of jobs are in the Data Engineering industry. The Median Salary Range for these jobs is $126,100 - $168,150.

    • 25% of job descriptions advertised a maximum salary above $177,500.
    • 5% of job descriptions advertised a maximum salary above $200,000.
  9. EdTech

    2% of jobs are in the EdTech industry. The Median Salary Range for these jobs is $145,000 - $155,000.

    • 25% of job descriptions advertised a maximum salary above $158,750.
    • 5% of job descriptions advertised a maximum salary above $160,000.
  10. Data Analytics

    2% of jobs are in the Data Analytics industry. The Median Salary Range for these jobs is $80,000 - $100,000.

    • 25% of job descriptions advertised a maximum salary above $182,500.
    • 5% of job descriptions advertised a maximum salary above $210,000.

Disclaimer: This analysis is based on salary ranges advertised in job descriptions found on Remoote.app. While it provides valuable insights into potential compensation, it's important to understand that advertised salary ranges may not always reflect the actual salaries paid to employees. Furthermore, not all companies disclose salary ranges, which can impact the accuracy of this analysis. Several factors can influence the final compensation package, including:

  • Negotiation: Salary ranges often serve as a starting point for negotiation. Your experience, skills, and qualifications can influence the final offer you receive.
  • Benefits: Salaries are just one component of total compensation. Some companies may offer competitive benefits packages that include health insurance, paid time off, retirement plans, and other perks. The value of these benefits can significantly affect your overall compensation.
  • Cost of Living: The cost of living in a particular location can impact salary expectations. Some areas may require higher salaries to maintain a similar standard of living compared to others.

Jobs

183 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ United States

πŸ’Έ 138,000 - 238,050 USD per year

πŸ” Life Science

  • Minimum Bachelor's degree in Cybersecurity, Data analytics, Information Systems, or a related field.
  • A minimum of 8 years of demonstrable experience in cyber metrics and reporting, business analytics, requirements analysis, forecasting, industry research, planning, and/or management consulting, with the ability to enable data storytelling and visualization
  • Experience in SQL data modeling, SQL query development and understanding of ETL is required.
  • Understanding of designing and developing cybersecurity dashboards using tools like Tableau, Power BI, or similar.
  • Strong experience with requirements analysis and testing
  • Awareness of relevant industry trends and standard processes in cybersecurity metrics and reporting.
  • Ability to collect, analyze, and interpret large datasets to derive relevant insights and trends.
  • Ability to effectively collect and communicate complex information to customers.
  • Proactive and self-motivated approach to work, with excellent problem-solving and analytical skills.
  • Superb communication and storytelling skills, with the ability to shape messages for various audiences, including delivering insights to executive-level audiences.
  • Analytical approach, with the ability to analyze data and identify insights.
  • Strong collaboration and customer management skills with a track record of successful delivery across multi-functional teams.
  • Ability to work effectively at all levels of the organization, from executive committee to individual employees.
  • Demonstrated Project Management, Scrum, and Agile skills are required
  • Shape, develop and implement an effective cybersecurity metrics & reporting requirements capability
  • Partner with customers to define report requirements, ensuring clarity and feasibility
  • Document detailed reporting specifications, including data sources, metrics, and presentation format.
  • Validate report designs, data, and developed reports against business requirements.
  • Conduct detailed testing on reports, focusing on data accuracy, visual integrity, and usability.
  • Work with visualization and data & analytics teams to resolve any identified issues.
  • Conduct data discovery tied to requirements, including exploratory data analysis, data quality assessment.
  • Analyze requirements and data sources to identify patterns, key insights, action items, and storytelling potential.
  • Translate data into value while assessing effective data visualization, experience design and trending analytics linked to business and cybersecurity objectives.
  • Describe the key elements of a successful data story: knowing one’s audience, defining a goal, maintaining engagement, and being explicit about the takeaways.
  • Measure how a data story achieves its intended goals and where it can be optimized.
  • Identify, design and implement requirements to inform reporting, dashboards and strategy.
  • Identify and create metrics to measure performance through automated dashboards.

SQL, Business Intelligence, Cybersecurity, Data Analysis, ETL, Tableau, Data engineering, Data visualization, Data modeling, Data analytics

Posted about 6 hours ago
Apply

πŸ“ United States

πŸ’Έ 123,000 - 235,900 USD per year

πŸ” Data/AI

🏒 Company: Databricks πŸ‘₯ 1001-5000 πŸ’° $684,559,082 Series I over 1 year ago | Artificial Intelligence (AI), Machine Learning, Analytics, Information Technology

  • 7+ years of experience in technical pre-sales, technical enablement, technical program management, or consulting with a focus on data, AI, or cloud technologies.
  • Experience building, delivering, and scaling technical enablement programs for highly skilled technical teams.
  • Proven ability to create, manage, and execute large-scale enablement programs, balancing technical rigor with structured program management.
  • Exceptional communication and presentation skills, with the ability to engage technical and executive audiences.
  • Strong stakeholder management and collaboration skills, with the ability to align multiple teams toward a common goal.
  • Experience in technical pre-sales roles, building proofs-of-concept, or implementing technical solutions for customers (Preferred)
  • Databricks certification or experience with Apache Sparkβ„’, MLflow, Delta Lake, and other open-source technologies (Preferred)
  • Design, implement, and scale enablement solutions that foster domain specialization, hands-on expertise, and technical mastery.
  • Introduce innovative multi-signal validation methods that assess expertise through real-world application and structured learning.
  • Facilitate enablement sessions, workshops, and hands-on activities that reinforce applied problem-solving and deep technical skills.
  • Develop and maintain technical content, including reference architectures, solution guides, and POC templates.
  • Measure impact and iterate on enablement programs, leveraging feedback and performance data to drive improvements.
  • Collaborate with technical field teams, enablement leaders, and stakeholders to continuously refine and scale high-impact training programs.
  • Drive adoption of enablement programs and strategies among senior leaders by proposing solutions that align with business priorities, address key challenges, and incorporate industry trends.

AWS, Project Management, Python, SQL, Cloud Computing, Data Analysis, ETL, GCP, Machine Learning, MLFlow, Apache Kafka, Azure, Data engineering, REST API, Communication Skills, Collaboration, CI/CD, Problem Solving, Mentoring, Presentation skills, Training, Data visualization, Stakeholder management, Strategic thinking, Data modeling, Customer Success

Posted about 11 hours ago
Apply

πŸ“ Albania, Bosnia and Herzegovina, Croatia, Czech Republic, Estonia, Greece, Georgia, Kosovo, Latvia, Lithuania, Moldova, Montenegro, North Macedonia, Poland, Portugal, Romania, Slovakia, Malta, Slovenia, Serbia, Cyprus, Bulgaria, Hungary, Netherlands, the United Kingdom, and South Africa

🧭 Full-Time

πŸ’Έ 1,800 - 2,800 EUR per month

πŸ” E-Commerce

🏒 Company: Foxelli Group πŸ‘₯ 51-100 | Digital Marketing, E-Commerce, Marketing

  • Strong analytical skills with experience in e-commerce data analysis, including customer behavior, sales, inventory, and website traffic.
  • Expertise in developing customer journey maps and reporting KPIs and business trends using tools like Google Analytics, Power BI, or Tableau.
  • Proficient in predictive modeling and optimizing marketing campaigns, inventory, and website layouts based on data-driven insights.
  • Experience with A/B testing, conversion rate optimization, and collaborating with UX/UI teams to enhance user experience.
  • Familiarity with data privacy laws and best practices for data governance and security.
  • Skilled in setting up automated performance monitoring and alerts for site issues or metric shifts.
  • Ability to stay updated with the latest industry trends and continuously improve data analysis processes and tools.
  • Analyze data to identify trends, patterns, and insights related to customer behavior, sales, inventory, and website traffic.
  • Develop customer journey maps for various segments, illustrating paths from initial contact to final purchase.
  • Create reports and dashboards highlighting KPIs, insights, and business trends.
  • Define and track necessary analytics, providing actionable recommendations for team success.
  • Develop predictive models to forecast sales and customer behavior for optimizing inventory, marketing, and product strategies.
  • Optimize marketing campaigns and website layouts using data-driven insights.
  • Conduct A/B testing to enhance website conversion rates and customer engagement.
  • Collaborate with UX/UI teams to implement data-driven improvements for a smoother customer journey.
  • Leverage insights to optimize marketing channels and increase engagement and conversion rates.
  • Monitor the impact of changes and continuously test to improve user experience and outcomes.
  • Ensure compliance with data privacy laws and enforce data governance and security policies.
  • Stay updated with industry trends and explore new tools and sources to enhance data analysis capabilities.
  • Monitor website performance daily and set up automated alerts for any significant issues.
  • Provide rapid response analyses and recommendations to address shifts or problems in site performance.

SQL, Business Intelligence, Data Analysis, Data Mining, ETL, Google Analytics, Tableau, REST API, Communication Skills, Analytical Skills, Problem Solving, JSON, Data visualization, Data modeling, A/B testing

Posted 1 day ago
Apply

πŸ“ US

🧭 Full-Time

πŸ’Έ 115,000 - 184,000 USD per year

πŸ” Software Development

🏒 Company: Toast πŸ‘₯ 51-100 | Location Based Services, Internet, Information Technology

  • 5+ years experience with Zuora Revenue (RevPro) developing and configuring features and solutions
  • Deep knowledge of 606 Revenue Recognition solutions and revenue accounting policies required for 606 compliance
  • 2+ years in Zuora Revenue implementation experience
  • Ability to have confident techno-functional conversations with cross-functional teams across departments, advising on and addressing concerns about finance and accounting concepts
  • Extensive knowledge of internal controls best practices and SOX Compliance
  • Ability to learn new technologies as they become prevalent and widely implemented, decision-making, time management, and task prioritization
  • Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or equivalent.
  • Own the overall configuration for the Zuora Revenue platform
  • Collaborate with product management and technical revenue leads to ensure balance between short- and long-term goals for the Financial Systems in a rapidly growing organization
  • Play a key role in owning and improving SOPs and monitoring best practices for Zuora Revenue
  • Partner with Revenue Technical Leads in the recommended evolution path for Zuora Revenue to meet the scale of our business
  • Define standards and processes to support and facilitate revenue financial systems for Toast
  • Work with SOX and compliance teams to ensure all controls are satisfied for audit
  • Triage and identify bug fixes required for Zuora Revenue, while working with 3rd party support
  • Own RevPro RTB (Run The Business) for the monthly close
  • Participate in all stages of the SDLC, from QA to UAT, for Zuora Revenue related fixes and upstream system impacts

SQL, ETL, API testing, REST API, Accounting, Compliance, Financial analysis, Data modeling

Posted 1 day ago
Apply
πŸ”₯ Data Engineer
Posted 1 day ago

πŸ“ United States

πŸ’Έ 112,800 - 126,900 USD per year

πŸ” Software Development

🏒 Company: Titan Cloud

  • 4+ years of work experience with ETL, Data Modeling, Data Analysis, and Data Architecture.
  • Experience operating very large data warehouses or data lakes.
  • Experience with building data pipelines and applications to stream and process datasets at low latencies.
  • MySQL, MSSQL Database, Postgres, Python
  • Design, implement, and maintain standardized data models that align with business needs and analytical use cases.
  • Optimize data structures and schemas for efficient querying, scalability, and performance across various storage and compute platforms.
  • Provide guidance and best practices for data storage, partitioning, indexing, and query optimization.
  • Developing and maintaining a data pipeline design.
  • Build robust and scalable ETL/ELT data pipelines to transform raw data into structured datasets optimized for analysis.
  • Collaborate with data scientists to streamline feature engineering and improve the accessibility of high-value data assets.
  • Designing, building, and maintaining the data architecture needed to support business decisions and data-driven applications. This includes collecting, storing, processing, and analyzing large amounts of data using AWS, Azure, and local tools and services.
  • Develop and enforce data governance standards to ensure consistency, accuracy, and reliability of data across the organization.
  • Ensure data quality, integrity, and completeness in all pipelines by implementing automated validation and monitoring mechanisms.
  • Implement data cataloging, metadata management, and lineage tracking to enhance data discoverability and usability.
  • Work with Engineering to manage and optimize data warehouse and data lake architectures, ensuring efficient storage and retrieval of structured and semi-structured data.
  • Evaluate and integrate emerging cloud-based data technologies to improve performance, scalability, and cost efficiency.
  • Assist with designing and implementing automated tools for collecting and transferring data from multiple source systems to the AWS and Azure cloud platform.
  • Work with DevOps Engineers to integrate any new code into existing pipelines
  • Collaborate with teams in troubleshooting functional and performance issues.
  • Must be a team player, able to work in an agile environment

AWS, PostgreSQL, Python, SQL, Agile, Apache Airflow, Cloud Computing, Data Analysis, ETL, Hadoop, MySQL, Data engineering, Data science, REST API, Spark, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Terraform, Attention to detail, Organizational skills, Microservices, Teamwork, Data visualization, Data modeling, Scripting

Posted 1 day ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 108,000 - 162,000 USD per year

πŸ” Insurance

🏒 Company: Openly πŸ‘₯ 251-500 πŸ’° $100,000,000 Series D over 1 year ago | Life Insurance, Property Insurance, Insurance, Commercial Insurance, Auto Insurance

  • 1 to 2 years of data engineering and data management experience.
  • Scripting skills in Python.
  • Basic understanding and usage of a development and deployment lifecycle, automated code deployments (CI/CD), code repositories, and code management.
  • Experience with Google Cloud data store and data orchestration technologies and concepts.
  • Hands-on experience and understanding of the entire data pipeline architecture: Data replication tools, staging data, data transformation, data movement, and cloud based data platforms.
  • Understanding of a modern next generation data warehouse platform, such as the Lakehouse and multi-data layered warehouse.
  • Proficiency with SQL optimization and development.
  • Ability to understand data architecture and modeling as it relates to business goals and objectives.
  • Ability to gain an understanding of data requirements, translate them into source to target data mappings, and build a working solution.
  • Experience with terraform preferred but not required.
  • Design, create, and maintain data solutions. This includes data pipelines and data structures.
  • Work with data users, data science, and business intelligence personnel, to create data solutions to be used in various projects.
  • Translate concepts into code to enhance our data management frameworks and services, striving to provide a high-quality data product to our data users.
  • Collaborate with our product, operations, and technology teams to develop and deploy new solutions related to data architecture and data pipelines to enable a best-in-class product for our data users.
  • Collaborate with teammates on design and solution decisions related to architecture, operations, deployment techniques, technologies, policies, processes, etc.
  • Participate in domain meetings, stand-ups, weekly 1:1s, team collaborations, and biweekly retros
  • Assist in educating others on different aspects of data (e.g. data management best practices, data pipelining best practices, SQL tuning)
  • Build and share your knowledge within the data engineer team and with others in the company (e.g. tech all-hands, tech learning hour, domain meetings, code sync meetings, etc.)

Docker, PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, ETL, GCP, Kafka, Kubernetes, Data engineering, Go, REST API, CI/CD, Terraform, Data modeling, Scripting, Data management

Posted 1 day ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 120,000 - 150,000 USD per year

πŸ” Healthcare

🏒 Company: Form Health πŸ‘₯ 101-250 πŸ’° $38,000,000 Series B 9 months ago | Personal Health, Medical, Health Care

  • 3+ years of experience in full stack software development, with expertise in both frontend and backend technologies
  • Hands-on experience and strong proficiency working with Ruby on Rails and React technologies
  • Experience in healthcare industry a plus, but not required
  • Experience working in an agile environment and comfortable with iterative development cycles
  • Ability to collaborate effectively with cross-functional teams, including product managers, designers, and other engineers.
  • Strong communication skills with the ability to articulate technical concepts to non-technical stakeholders
  • Design, implement, and maintain real-time insurance eligibility verification integrations with various payers.
  • Troubleshoot and resolve eligibility discrepancies, ensuring accurate patient information within our systems.
  • Develop and maintain integrations with Elation EMR, ensuring seamless data flow between patient records, scheduling, and other clinical workflows.
  • Manage and optimize integrations with Candid for billing, focusing on accurate claim generation, submission, and reconciliation.
  • Design, build, and maintain ETL pipelines to facilitate data exchange with our benefits partners.
  • Develop back-end services and clinical tools with Ruby on Rails, Sidekiq, Node (AWS Lambda), GraphQL, and ReactJS.
  • Continuously improve infrastructure (AWS), GraphQL APIs (Apollo Server), data modeling (PostgreSQL, Snowflake), integrations, and other services.
  • Collaborate with cross-functional teams, including engineering, product, and client success, to define integration requirements and ensure alignment.
  • Collaborate with other developers: native mobile, web, back-end/full-stack, and data engineers.
  • Collaborate with stakeholders to support rapid iteration of internal tools to help our business operations scale efficiently and create better experiences and outcomes for our patients.
  • Communicate effectively with external partners to understand their technical needs and address integration challenges.
  • Participate in code reviews, and contribute to technical documentation.
  • Inform technical, product, process, hiring, and architectural decisions (including build vs buy).

AWS, Backend Development, GraphQL, Node.js, PostgreSQL, SQL, Agile, ETL, Full Stack Development, Git, HTML, CSS, Javascript, React.js, Ruby, Ruby on Rails, Snowflake, REST API, CI/CD, JSON, Data modeling, Software Engineering, Debugging

Posted 1 day ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 126,650 - 179,000 USD per year

πŸ” Mental Healthcare

🏒 Company: Headway πŸ‘₯ 201-500 πŸ’° $125,000,000 Series C over 1 year ago | Mental Health Care

  • 6+ years of full-time experience using data to surface key insights and recommendations, solve real-world problems, and support decision making within growth, product, revenue, or operations teams
  • 4+ years of full-time experience at a high-growth technology startup
  • Proficient in SQL, developing and documenting data models, and experience managing a modern BI tool like Metabase or Looker
  • Become a trusted strategic partner to our operations and business teams (growth, revenue, and customer success), and work with them to define analytics requirements, identify important questions, and help answer them with data.
  • Support the operational needs of business stakeholders by delivering high-quality output from data modeling and reverse ETL.
  • Collaborate with Data and Product Analysts, Data Engineers, Data Scientists, and stakeholders across the company to model data in a scalable way and present it in an impactful and digestible way that leads to better business decisions.
  • Tell compelling stories through data, articulating technical information to a non-technical audience

Python, SQL, Data Analysis, ETL, Snowflake, Data engineering, REST API, Data visualization, Data modeling, Data analytics

Posted 1 day ago
Apply

πŸ“ United States

πŸ’Έ 70,000 - 105,000 USD per year

πŸ” Software Development

🏒 Company: VUHL

  • Relevant experience in data engineering or a related discipline.
  • Demonstrated ability to code effectively and a solid understanding of software engineering principles.
  • Experience using SQL or other query language to manage and process data.
  • Experience using Python to build ETL pipelines
  • Experience working with data from various sources and in various formats, including flat files, REST APIs, Excel files, JSON, XML, etc.
  • Experience with Snowflake, SQL Server, or related database technologies.
  • Experience using orchestration tools like Dagster (preferred), Apache Airflow, or similar.
  • Preference for Agile product delivery.
  • Familiarity with GIT, Change Management, and application lifecycle management tools.
  • Ability to influence others without positional control.
  • Create and deliver functional ETL pipelines and other data solutions using core technologies like SQL, Python, Snowflake, Dagster, and SSIS in an agile development environment. Apply sound database design principles and adhere to Clean Code practices.
  • Engage in whole team planning, retrospectives, and communication. Interact with Architects and Product Owners to translate requirements into actionable business logic.
  • Participate in proposing and adopting Engineering standards related to architectural considerations and non-functional requirements such as security, reliability, and stability. Ensure proper management and visibility of borrower data and the life of a loan. Contribute to data governance initiatives.
  • Actively contribute to strengthening the team and culture by taking on various duties as needed, excluding licensed activities.

Python, SQL, Agile, Apache Airflow, ETL, Git, Snowflake, Data engineering, REST API, JSON, Data modeling, Software Engineering, Data management

Posted 1 day ago
Apply

πŸ“ Canada

πŸ’Έ 98,400 - 137,800 CAD per year

πŸ” Data Technology

🏒 Company: Hootsuite πŸ‘₯ 1001-5000 πŸ’° $50,000,000 Debt Financing almost 7 years ago πŸ«‚ Last layoff about 2 years ago | Digital Marketing, Social Media Marketing, Social Media Management, Apps

  • A degree in Computer Science or Engineering and senior-level experience developing and maintaining software (or an equivalent level of education or work experience), with a track record of substantial contributions to software projects with high business impact.
  • Experience planning and leading a team using Scrum agile methodology ensuring timely delivery and continuous improvement.
  • Experience liaising with various business stakeholders to understand their data requirements and convey the technical solutions.
  • Experience with data warehousing and data modeling best practices.
  • Passionate interest in data engineering and infrastructure; ingestion, storage and compute in relational, NoSQL, and serverless architectures
  • Experience developing data pipelines and integrations for high volume, velocity and variety of data.
  • Experience writing clean code that performs well at scale; ideally experienced with languages like Python, Scala, SQL and shell script.
  • Experience with various types of data stores, query engines and data frameworks, e.g. PostgreSQL, MySQL, S3, Redshift, Presto/Athena, Spark and dbt.
  • Experience working with message queues such as Kafka and Kinesis
  • Experience with ETL and pipeline orchestration such as Airflow, AWS Glue
  • Experience with JIRA in managing sprints and roadmaps
  • Lead development and maintenance of scalable and efficient data pipeline architecture
  • Work within cross-functional teams, including Data Science, Analytics, Software Development, and business units, to deliver data products and services.
  • Collaborate with business stakeholders and translate requirements into scalable data solutions.
  • Monitor and communicate project statuses while mitigating risk and resolving issues.
  • Work closely with the Senior Manager to align team priorities with business objectives.
  • Assess and prioritize the team's work, appropriately delegating to others and encouraging team ownership.
  • Proactively share information, actively solicit feedback, and facilitate communication, within teams and other departments.
  • Design, write, test, and deploy high quality scalable code.
  • Maintain high standards of security, reliability, scalability, performance, and quality in all delivered projects.
  • Contribute to shape our technical roadmap as we scale our services and build our next generation data platform.
  • Build, support and lead a high performance, cohesive team of developers, in close partnership with the Senior Manager, Data Analytics.
  • Participate in the hiring process, with an aim of attracting and hiring the best developers.
  • Facilitate ongoing development conversations with your team to support their learning and career growth.

AWS, Leadership, PostgreSQL, Python, Software Development, SQL, Agile, Apache Airflow, ETL, MySQL, SCRUM, Cross-functional Team Leadership, Algorithms, Apache Kafka, Data engineering, Data Structures, Spark, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Mentoring, DevOps, Written communication, Coaching, Scala, Data visualization, Team management, Data modeling, Data analytics, Data management

Posted 1 day ago
Apply
Shown 10 out of 183