Remote UI Designer Jobs

516 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ United States

πŸ’Έ 64,000 - 120,000 USD per year

  • Strong PL/SQL, SQL development skills
  • Proficient in multiple languages used in data engineering such as Python, Java
  • Minimum 3-5 years of experience in Data engineering working with Oracle and MS SQL
  • Experience with data warehousing concepts and technologies including cloud-based services (e.g. Snowflake)
  • Experience with cloud platforms like Azure and knowledge of infrastructure
  • Experience with data orchestration tools (e.g. Azure Data Factory, DataBricks workflows)
  • Understanding of data privacy regulations and best practices
  • Experience working with remote teams
  • Experience working on a team with a CI/CD process
  • Familiarity using tools like Git, Jira
  • Bachelor's degree in Computer Science or Computer Engineering
  • Design, implement and maintain scalable pipelines and architecture to collect, process, and store data from various sources.
  • Unit test and document solutions that meet product quality standards prior to release to QA.
  • Identify and resolve performance bottlenecks in pipelines due to data, queries and processing workflows to ensure efficient and timely data delivery.
  • Implement data quality checks and validations processes to ensure accuracy, completeness and consistency of data delivery.
  • Work with Data Architect and implement best practices for data governance, quality and security.
  • Collaborate with cross-functional teams to identify and address data needs.
  • Ensure technology solutions support the needs of the customer and/or organization.
  • Define and document technical requirements.

Python · SQL · ETL · Git · Java · Oracle · Snowflake · Azure · Data engineering · CI/CD · RESTful APIs

Posted 28 minutes ago
Apply
πŸ”₯ Senior Growth Engineer
Posted about 2 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 140,000 - 160,000 USD per year

πŸ” Email Security

🏒 Company: Valimail

  • 4+ years of experience with Python, SQL, or similar languages, with hands-on experience in automation and data flow management in platforms like Snowflake, Segment, Zapier, and Planhat.
  • 2+ years in an operations role supporting Sales, Marketing, Customer Success, or Finance, with a demonstrated ability to align data operations with strategic business goals.
  • Required proficiency in Snowflake, Segment (or other CDP solutions), and BI tools (e.g., Sigma, PowerBI).
  • Familiarity with Zapier, Planhat (or similar tools such as Gainsight or ChurnZero), Salesforce, and Atlassian are a plus.
  • Oversee and optimize data flow across core platforms, including Snowflake, Sigma, Segment, Salesforce, and Planhat, ensuring seamless integration and reliable data access for cross-functional teams.
  • Build strong, collaborative relationships with Marketing, Sales, Finance, Product, and Engineering, facilitating data-driven insights and project alignment across departments.
  • Analyze customer data to derive insights that inform strategic decision-making, delivering actionable reports that support growth objectives.
  • Spearhead automation and optimization projects, implementing solutions that improve data flow and system efficiency.
  • Lead key cross-functional initiatives, ensuring smooth execution and alignment across multiple departments.
  • Provide valuable insights and recommendations to leadership, aligning data-driven findings with broader business objectives.
  • Tackle a wide range of tasks, from technical troubleshooting to strategic planning and cross-departmental collaboration.

Project Management · Python · SQL · Business Intelligence · Data Analysis · ETL · Salesforce · Snowflake · Operations Management · API testing · Data engineering · REST API · Communication Skills · Analytical Skills · CI/CD · Problem Solving · DevOps · Cross-functional collaboration · Data visualization · Strategic thinking · Data modeling · Data management

Posted about 2 hours ago
Apply

πŸ“ United States

πŸ” Software Development

🏒 Company: Kentro

  • Proven experience as a Power BI Engineer with a track record of translating complex data into actionable insights.
  • Proficient in data modeling, DAX, and data transformation from multiple sources.
  • Skilled in creating visually engaging and interactive reports and dashboards using Power BI with an understanding of user-centric design principles.
  • Experience in data testing and quality control to ensure data accuracy and reliability.
  • Experience in an object-oriented language (e.g. Java, C++, or C#)
  • Experience with scripting (e.g. JavaScript, Python)
  • Create visually engaging and interactive reports and dashboards that fully harness the potential of Power BI. Use various visualization techniques to present data insights in an easily understandable format.
  • Utilize your proficiency in data modeling, DAX, and data transformation from multiple sources to create robust and scalable data models. Ensure data is accurately transformed and ready for analysis.
  • Employ a user-centric design methodology to develop and automate dashboards. Ensure the design is intuitive, user-friendly, and meets the specific needs of leadership.
  • Develop solutions that enable real-time data for informed decision-making. Ensure dashboards are updated with the latest data and provide actionable insights.
  • Work collaboratively as part of a team to ensure accessibility for all users. Develop front-end functions that work seamlessly across browsers, platforms, and devices while meeting stringent accessibility and security requirements.
  • Implement rigorous testing procedures to ensure the accuracy and reliability of the data presented in reports and dashboards. Validate data integrity and continuously monitor data quality to prevent errors and inconsistencies.
  • Ensure that all developed solutions meet accessibility and security requirements. Stay updated with the latest regulations and best practices to ensure compliance.
  • Ability to work independently to achieve successful results with minimal guidance.

SQL · ETL · User Experience Design · Communication Skills · Analytical Skills · CI/CD · Problem Solving · RESTful APIs · Quality Assurance · Data visualization · Data modeling · Scripting

Posted about 2 hours ago
Apply

πŸ“ United States

πŸ’Έ 110,000 - 125,000 USD per year

🏒 Company: KnowBe4 · πŸ‘₯ 1001-5000 · πŸ’° $300,000,000 Post-IPO Equity almost 2 years ago · Computer Security · Cyber Security · Network Security · Software

  • Bachelor's degree in relevant discipline (MIS, IT business/computer science and/or Accounting), or equivalent experience
  • Minimum 2 years of hands on experience with NetSuite Administration and Implementations
  • NetSuite Administrator level knowledge of roles, permissions, page layouts, custom objects, scripting, workflows, reports, saved searches, dashboards and forms to manage unique business process requirements within NetSuite
  • Experience completing full cycle NetSuite implementations for SaaS-based and/or consumer product business models (ideally both)
  • Demonstrates full understanding of the NetSuite stack and experienced with all NetSuite modules (Order to Case and Fulfillment, Revenue Recognition, Procurement, GL, AP, AR, FA, etc.)
  • Working knowledge of Suite Script, Suite Flow, csv imports, user provisioning, defining roles, analytics and scheduling scripts
  • Experience with Segregation of Duties and SOX controls
  • Experience with Strongpoint and Adaptive Insights, a plus
  • NetSuite Administrator Certification
  • Extensive technical knowledge of Netsuite
  • Strong analytical skills and ability to research, simplify, and resolve complex issues
  • Deep understanding of customizations and business processes as they relate to NetSuite
  • Microsoft Word, Excel, PowerPoint, and Google suite knowledge
  • Basic Data Analysis experience with Excel (v-lookups, pivot tables)
  • A deep understanding of business processes and requirements
  • System implementation and system integration experience
  • Collaborate and partner with internal stakeholders to ensure the design and functionality of the Enterprise Resource Planning (ERP) system is driving value and efficiency for the organization
  • Maintain and support the NetSuite ERP applications and related 3rd party integrations for internal stakeholders
  • Design, implement, and maintain configurations and customizations of NetSuite to meet the organization’s business needs
  • Support end-user requests for new saved searches, reports, key performance indicators (KPIs), and dashboards.
  • Provide internal, on-going system/technical support to users, including training and maintain proper system access for all roles
  • Develop, test, and deploy customizations, custom objects and new NetSuite functionality based on evolving business needs
  • Identify, evaluate, and recommend other key technologies required to support and improve the business process centered on the NetSuite platform
  • Maintain up-to-date knowledge of NetSuite functionality, customization, and integration
  • Support and help manage the change management and release process for NetSuite
  • Provide front line support with regards to all aspects of NetSuite and data integrity
  • Develop, document, and implement policies, procedures, and guidelines to ensure data integrity and change protocols
  • Develop and execute on plans for managing information technology and security for the organization, including activities to be performed in-house or through third-party relationships to best manage the organization's information systems
  • Configure NetSuite to support transaction flow from source applications and systems

SQL · Business Analysis · Data Analysis · ETL · Microsoft Excel · Accounting · Financial analysis · Scripting · Change Management

Posted about 3 hours ago
Apply

πŸ“ United States

πŸ” Software Development

🏒 Company: Servant · πŸ‘₯ 11-50 · Consulting · Advice · Professional Services

  • Advanced SQL skills for querying complex datasets, optimizing performance, and building custom data models.
  • Proficiency in Python for data manipulation, statistical analysis, automation, and developing quick ETL workflows.
  • Experience with product analytics tools such as Google Analytics, Amplitude, Mixpanel, or similar platforms.
  • Familiarity with UI/UX analytics, including heatmaps, clickstream analysis, and user flow tracking.
  • Expertise in creating dashboards and reports using Tableau, Power BI, or other visualization tools.
  • Strong foundation in statistical analysis, A/B testing, and experimentation methodologies.
  • Experience with cohort analysis, retention analysis, and growth metrics to support business growth initiatives.
  • Proven ability to work effectively with diverse stakeholders across Product, Marketing, Finance, Operations, and Donor Development.
  • Analyze user behavior, feature adoption, retention trends, and app engagement to identify opportunities for product improvement.
  • Conduct cohort analyses, funnel tracking, and A/B test evaluations to measure the impact of product changes and experiments.
  • Partner with Product Managers to translate business questions into analytical solutions, providing data-driven recommendations to influence product strategy.
  • Collaborate with teams across Marketing, Finance, Operations, Donor Development, and other departments to understand their data needs and deliver actionable insights.
  • Support marketing initiatives with campaign performance analysis, segmentation strategies, and conversion tracking.
  • Provide financial analysis support related to donation trends, revenue forecasts, and operational efficiency metrics.
  • Contribute to donor analytics by tracking engagement, retention, and giving patterns to optimize fundraising strategies.
  • Write complex SQL queries to extract, manipulate, and analyze large datasets from Snowflake, BigQuery, and other data sources.
  • Use Python for data wrangling, statistical analysis, automation of repetitive tasks, and developing quick ETL pipelines as needed.
  • Independently identify and implement solutions to data challenges, minimizing dependencies on Data Engineers.
  • Design and build interactive dashboards in Tableau, Power BI, or custom solutions to visualize key KPIs such as MAUs, donation trends, campaign performance, and operational metrics.
  • Automate reporting processes to improve efficiency, reduce manual errors, and ensure consistent access to up-to-date data.
  • Continuously improve dashboards based on user feedback and evolving business needs, ensuring insights are both accessible and impactful.
  • Leverage product analytics tools (e.g., Google Analytics, Amplitude, Mixpanel) to analyze user interactions and behaviors within digital products.
  • Conduct heatmap analyses, clickstream tracking, and user journey mapping to identify friction points and areas for optimization.
  • Provide data-driven recommendations to Product and Design teams to enhance user experiences and improve conversion rates.
  • Design, implement, and analyze A/B tests to evaluate new product features, marketing campaigns, and operational changes.
  • Develop statistical models to forecast user growth, donor retention, and revenue trends based on historical data and predictive analytics.
  • Recommend growth strategies backed by data-driven experimentation and robust analysis.
  • Ensure data accuracy, consistency, and reliability across all reports and analyses by implementing data validation checks and monitoring data pipelines.
  • Partner with the Data Architect and Data Governance Lead to uphold data standards and best practices across the organization.
  • Proactively identify and resolve data discrepancies, ensuring high data integrity in all outputs.
  • Stay current on emerging trends in product analytics, data science, and business intelligence tools.
  • Experiment with new technologies, programming languages, and analytical methods to continuously enhance data capabilities.
  • Share knowledge and best practices with the broader ACoE team, contributing to a culture of continuous improvement and data literacy across the organization.

Python · SQL · Business Intelligence · Data Analysis · ETL · Numpy · Snowflake · Google Analytics · Amplitude Analytics · Tableau · Product Analytics · REST API · Pandas · Communication Skills · Analytical Skills · CI/CD · Problem Solving · Written communication · Reporting · JSON · Cross-functional collaboration · Data visualization · Financial analysis · Data modeling · A/B testing

Posted about 5 hours ago
Apply

πŸ“ Germany

πŸ” AI and data analytics consulting

🏒 Company: Unit8 SA

  • MSc level in the field of Computer Science, Machine Learning, Applied Statistics, Mathematics, or equivalent work experience.
  • Proficient software engineer who has experience in applying a blend of software engineering, machine learning, and statistical methods to solve real-world business problems
  • Proficient in one of the following languages: Python, Scala, Java.
  • Experience with cloud technologies is a strong plus.
  • Work with our customers to understand their challenges, design and implement solutions.
  • Closely collaborate with other data scientists, software engineers and business stakeholders.
  • Evaluate, compare and present results to technical and non-technical audience.
  • Contribute to the implementation and engineering of systems at different scales: from small proof-of-concepts to larger end-to-end data systems.
  • Implement best practices in CI/CD

AWS · Docker · Python · SQL · Cloud Computing · Data Analysis · ETL · Java · Kubernetes · Machine Learning · Algorithms · Data engineering · Data science · Spark · CI/CD · RESTful APIs · Scala · Software Engineering

Posted about 5 hours ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ” Audit and advisory industry

🏒 Company: Fieldguide · πŸ‘₯ 101-250 · πŸ’° $30,000,000 Series B 12 months ago · Artificial Intelligence (AI) · Document Management

  • 3-4 years of experience in software engineering, DevOps, or related field with a focus on ML systems
  • Experience with ML frameworks
  • Experience with cloud platforms, preferably AWS
  • Experience with container runtime architectures, preferably Kubernetes
  • Proficiency with at least one programming language, preferably Python or Typescript
  • Familiarity with CI/CD practices and tools
  • Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation
  • Strong understanding of distributed systems and microservices architecture
  • Ability to work in a fast-paced, changing startup environment
  • Design and implement infrastructure for ML model management, including training, deployment, and monitoring
  • Build and maintain platforms for running ML algorithms at scale
  • Develop systems for A/B testing, performance monitoring, and continuous model training
  • Create and manage ETL infrastructure to support ML workflows
  • Implement best practices for MLOps, including version control for models and datasets
  • Collaborate with ML Engineers to optimize model performance and resource utilization
  • Ensure the scalability, reliability, and security of ML systems
  • Stay current with the latest advancements in MLOps and cloud technologies
  • Contribute to the development of internal tools and frameworks to improve ML workflow efficiency
  • Be an essential technical contributor at a Series B-stage company as it scales

AWS · Docker · Python · Cloud Computing · ETL · Kubernetes · Machine Learning · CI/CD · Terraform · Microservices · Software Engineering

Posted about 5 hours ago
Apply

πŸ“ USA

🧭 Full-Time

πŸ” Audit and advisory

🏒 Company: Fieldguide · πŸ‘₯ 101-250 · πŸ’° $30,000,000 Series B 12 months ago · Artificial Intelligence (AI) · Document Management

  • 4-5 years of experience in applied machine learning or related field
  • Strong proficiency in Python and its ML/data science libraries
  • Extensive experience with NLP techniques and generative AI technologies
  • Experience with LLMs and both text-to-text and text-to-image generative models
  • Proficiency in working with large datasets and creating ETL processes
  • Experience with version control systems (e.g., Git) and CI/CD practices
  • Ability to work in a fast-paced, changing startup environment
  • Collaborate with stakeholders to identify and map business problems to ML solutions
  • Design, develop, and implement ML models with a focus on NLP and generative AI applications
  • Curate, clean, and prepare data for model development and training
  • Create and maintain ETL jobs for data processing
  • Conduct rapid prototyping of ML solutions to quickly iterate on ideas
  • Stay current with the latest advancements in ML, particularly in NLP and generative AI
  • Collaborate with the platform engineering team to integrate ML solutions into the overall product architecture
  • Implement data flywheels to continuously improve ML features through increased usage
  • Define and implement ML performance metrics
  • Contribute to the product roadmap with ML-driven feature ideas
  • Be an essential technical contributor at a Series B-stage company as it scales

AWS · Python · SQL · ETL · Git · Machine Learning · Numpy · Data science · Pandas · Tensorflow · CI/CD · A/B testing

Posted about 5 hours ago
Apply
πŸ”₯ Senior Data Analyst
Posted about 7 hours ago

πŸ“ Ireland, United States (AZ, CA, CO, CT, FL, GA, IL, IN, IA, KS, MA, MI, MN, MO, NE, NJ, NY, NC, PA, SD, TN, TX, UT, WA, WI)

πŸ” Data Analysis

🏒 Company: Sojern · πŸ‘₯ 501-1000 · πŸ’° $9,842,000 over 1 year ago · πŸ«‚ Last layoff over 1 year ago · Digital Marketing · Advertising Platforms · SaaS · Travel

  • 3+ years of experience in a data-driven role, preferably in analytics, data engineering, or business intelligence.
  • Strong proficiency in SQL (writing complex queries, working with relational databases).
  • Exposure to building ETL pipelines
  • Familiarity with BI tools (Tableau, Power BI, Looker, etc.) and the ability to create meaningful data visualizations.
  • Ability to work in a fast-paced environment, manage multiple priorities, and communicate insights effectively to both technical and non-technical stakeholders.
  • Work closely with Sales, Finance, Product, and Marketing teams to understand business needs and translate them into data solutions.
  • Analyze large datasets to identify trends, performance metrics, and actionable insights that support strategic business initiatives.
  • Develop detailed reports and dashboards using Tableau to help stakeholders make data-driven decisions.
  • Collaborate with data scientists and operations analysts to support testing and analysis on campaigns.
  • Build, optimize, and maintain ETL pipelines to process and transform raw data into structured, meaningful datasets.
  • Work with SQL, Python, and BigQuery to extract, clean, and manipulate data from multiple sources.
  • Ensure data quality, integrity, and consistency across different data sources.

Python · SQL · Apache Airflow · Business Intelligence · Data Analysis · ETL · Jenkins · Tableau · Algorithms · Data engineering · Data Structures · REST API · Reporting · Data entry · Data visualization · Data modeling · Data analytics · Data management

Posted about 7 hours ago
Apply
πŸ”₯ Principal Data Engineer
Posted about 7 hours ago

πŸ“ United Kingdom

🧭 Full-Time

πŸ” Insurance

🏒 Company: external

  • Extensive experience of designing and building end to end data solutions (10 years +)
  • Experience of carrying out data engineering design and build activities using agile working practices
  • Experience of Databricks solutions, Databricks administration and pyspark
  • Data Factory/Synapse Workspace – for building data pipelines or synapse analytics pipelines
  • Data Lake – Delta Lake design pattern implementation experience in Azure Data Lake Gen2
  • Synapse Warehouse/Analytics – Experience in Synapse data mappings, external tables, schema creation from SSMS, knowledge on how Synapse pool works behind the scenes
  • Azure Active Directory – for Managed identities creation and usage or for generating service principles for authentication and authorization
  • Version Control – Experience in building Data Ops i.e., CICD pipelines in Azure DevOps with managed identity
  • Unit Testing – Experience in writing unit tests for data pipelines
  • Data Architecture – Knowledge or experience in implementing, Kimball style Data Warehouse
  • Data Quality – Experience in applying Data Quality rules within Azure Data Flow Activities
  • Data Transformation – Extensive hands on with Azure Data Flow Activities for Cleansing, transforming, validation and quality checks
  • Azure Cloud – Knowledge and confidence in effective communication on Azure Cloud Subscriptions
  • Create or guide the low-level design of data solutions
  • Be responsible for the quality of the overall data platform(s)
  • Be responsible for coding standards, low level design and ingestion patterns for the data platform(s)
  • Develop high complexity, secure, governed, high quality, efficient data pipelines
  • Set the standards and ensure that data is cleansed, mapped, transformed and optimised for storage
  • Design and build of data observability and data quality by design into all Data pipelines
  • Build solutions that pipe transform data into data lake storage areas, physical database models and reporting structures across data lake, data warehouse, business intelligence systems and analytics applications
  • Build physical data models that are appropriately designed to meet business needs and optimise storage requirements
  • Carry out unit testing of own code, peer testing of others code
  • Ensure that effective, and appropriate documentation that brings transparency and understandability are in place for all content on the data platform(s)
  • Coach and mentor Senior Data Engineers, Data engineers & Associate Data Engineers
  • Create high complexity BI solutions

Python · SQL · Agile · Business Intelligence · ETL · Microsoft Power BI · Apache Kafka · Azure · Data engineering · CI/CD · Data modeling · Data analytics

Posted about 7 hours ago
Apply
Shown 10 out of 516

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.