Senior Software Engineer - Data

Posted 2 months ago

💎 Seniority level: Senior, Extensive experience

💸 Salary: 153,000 - 181,000 USD per year

🔍 Industry: Creator marketing platform

🗣️ Languages: English

⏳ Experience: Extensive experience

Requirements:
  • Extensive experience with Rails
  • Experience with React
  • Experience working with large web application datasets to improve integrity and quality
  • Previous experience writing code that is accessible, scalable, maintainable, and performant
  • Strong code reading comprehension skills
  • Ability to articulate and synthesize problems across written, visual, and auditory formats, communicating effectively and bringing clarity to any challenge
  • Ability to deliver feedback with empathy and propose solutions
  • Autonomy and self-motivation to work effectively in a 100% remote company
  • Enthusiasm for and belief in the company's mission, vision, and values
Responsibilities:
  • Own solving problems across the full stack in collaboration with your team
  • Actively participate in driving the technical direction of our codebase
  • Collaborate with team members across the organization
  • Communicate proactively with product stakeholders
  • Stay up to date with current technical best practices

Related Jobs

๐Ÿ“ North America, Europe

๐Ÿงญ Full-Time

๐Ÿ’ธ 140000.0 - 170000.0 CAD / USD per year

๐Ÿ” Blockchain technology

  • Proven extensive experience in software engineering and distributed systems.
  • Ability to write production-ready applications in Go.
  • Ability to reason about tradeoffs between different engineering approaches.
  • Familiar with data storage solutions (SQL/noSQL databases).
  • Familiar with containerized infrastructure (Docker, Kubernetes).
  • Results-oriented individual with a high EQ and attention to detail.
  • Ability to architect solutions, factoring in maintenance, scalability, and security.
  • Familiar with team processes based on agile methodology.
  • Collaborative approach to sharing ideas and finding innovative solutions.
  • Improve overall productivity through technical leadership and mentorship.

  • Create high-performance indexing software in Go to ingest data from blockchains and other sources.
  • Take full responsibility for technical architecture and team processes.
  • Work closely with other business units to prioritize deliverables and set timelines.
  • Make technical decisions and explain them to team members for buy-in.
  • Process big data collections and design fast-read data stores.
  • Design and implement high availability APIs for large blockchain datasets.
  • Design next generation data pipelines.
  • Mentor junior team members.

Docker, SQL, Blockchain, Kubernetes, Data engineering, Go, NoSQL

Posted 13 days ago

๐Ÿ“ Canada, US, NOT STATED

🧭 Full-Time

💸 140,000 - 170,000 CAD per year

🔍 Blockchain technology, Web3

🏢 Company: Figment · 👥 11-50 · Hospitality, Travel Accommodations, Art

  • Proven extensive experience in software engineering and distributed systems.
  • Ability to write production-ready applications in Go.
  • Ability to reason about tradeoffs between different engineering approaches.
  • Familiarity with data storage solutions such as SQL and NoSQL databases.
  • Experience with containerized infrastructure, including Docker and Kubernetes.
  • High attention to detail with a results-oriented approach.
  • Ability to architect solutions considering maintenance, scalability, and security.
  • Familiar with agile methodologies and team processes.
  • Collaborative approach for sharing ideas and finding innovative solutions.

  • Create high-performance indexing software in Go to ingest data from blockchains and other sources.
  • Take full responsibility for technical architecture and team processes.
  • Work closely with various business units to prioritize deliverables and set timelines.
  • Make technical decisions on different engineering approaches and achieve buy-in from team members.
  • Process big data collections and design fast-read data stores.
  • Design and implement highly available APIs for large blockchain datasets.
  • Design next generation data pipelines.
  • Serve as a mentor for junior team members.

Docker, SQL, Blockchain, Kubernetes, Go, NoSQL

Posted 13 days ago

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 165000.0 - 210000.0 USD per year

๐Ÿ” Software Development / Data Platform

๐Ÿข Company: Temporal Technologies๐Ÿ‘ฅ 51-200๐Ÿ’ฐ $75,000,000 Series B almost 2 years agoSoftware Development

  • At least 7 years of post-graduate industry experience across the data stack (ingestion, storage, compute, data modeling, visualization).
  • Zero-to-one experience designing and architecting a data platform for a SaaS business.
  • Experience owning key components of an SLA-bearing production data platform.
  • Experience building out data lake architectures.
  • Deep expertise within at least one of the major cloud providers (AWS, GCP, Azure).
  • Experience working with a wide range of data sources (APIs, logs, event stores, etc.).
  • Expert-level proficiency in Python and SQL; additional languages are a plus.
  • Experience with multiple data processing and query engines (Spark, Presto/Trino, Athena, BigQuery, etc.).
  • Significant experience with both object stores (e.g., S3) and relational databases (e.g., Redshift).
  • Ability to quickly gain proficiency in new tools and technologies.
  • Strong desire to continue to learn and experiment.
  • Strong communicator and collaborator with business impact focus.

  • Designing and architecting the data platform for scalability and near-term value.
  • Building data pipelines for event-driven and batch workloads.
  • Implementing data quality checks in existing pipelines (see the sketch after this list).
  • Modeling data for OLAP purposes.
  • Evaluating and recommending data processing tools.
  • Owning projects through collaboration with engineers and business stakeholders.
  • Creating guidelines for data access and ingestion.
  • Monitoring performance of data operations.
  • Contributing to the data platform roadmap.
  • Creating actionable dashboards.
  • Training stakeholders on new data products.
  • Mentoring junior engineers.
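
To make the data-quality bullet above concrete, here is a minimal sketch of the kind of check a batch pipeline might run before publishing a table. The column names (event_id, user_id, event_ts) and the thresholds are illustrative assumptions, not details from the posting.

```python
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures for one batch of event rows.

    Column names and thresholds are illustrative assumptions only.
    """
    failures = []

    # Uniqueness: the primary-key column should not contain duplicates.
    if df["event_id"].duplicated().any():
        failures.append("duplicate event_id values found")

    # Completeness: allow at most 1% missing user_id values.
    null_ratio = df["user_id"].isna().mean()
    if null_ratio > 0.01:
        failures.append(f"user_id null ratio too high: {null_ratio:.2%}")

    # Freshness: the newest event should be less than 24 hours old.
    max_ts = pd.to_datetime(df["event_ts"], utc=True).max()
    if pd.Timestamp.now(tz="UTC") - max_ts > pd.Timedelta(hours=24):
        failures.append(f"stale data: newest event is {max_ts}")

    return failures


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "event_id": [1, 2, 3],
            "user_id": ["a", "b", None],
            "event_ts": ["2024-01-01T00:00:00Z", "2024-01-02T00:00:00Z", "2024-01-03T00:00:00Z"],
        }
    )
    for failure in run_quality_checks(sample):
        print("QUALITY CHECK FAILED:", failure)
```

In a real platform, checks like these would typically run as a task in the orchestration layer and alert on failure rather than print.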

AWS, Python, SQL, ETL, GCP, Data engineering, Spark, Data modeling

Posted 16 days ago

๐Ÿ“ San Francisco Bay Area, Seattle, India, UK

๐Ÿ’ธ 150000.0 - 180000.0 USD per year

๐Ÿ” B2B technology

  • Four-year degree in Computer Science, or related field OR equivalent experience.
  • Understanding of data strategies, data analysis, and data model design.
  • Experience designing and building low latency analytics APIs.
  • Proficiency in at least one JVM language (Java, Scala, Kotlin, etc.).
  • Familiarity with the Spark Ecosystem (YARN, Executors, Livy, etc.).
  • Experience with data orchestration frameworks, particularly Airflow or similar.
  • Experience with columnar data stores, particularly Parquet and StarRocks.
  • Strong SDLC principles (CI/CD, Unit Testing, git, etc.).
  • General understanding of AWS EMR, EC2, S3.

  • Design and build the next generation of Demandbase's Unified Data Platform.
  • Develop data pipelines for ingestion, normalization, and analysis.
  • Integrate 3rd party and open source tools into the data platform.
  • Build DAGs in Airflow for orchestration and monitoring of data pipelines.
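
As a rough illustration of the Airflow bullet directly above, here is a minimal DAG sketch with a linear extract → normalize → load chain. The DAG id, schedule, and task bodies are hypothetical placeholders, and it assumes Airflow 2.4+ (which accepts the schedule argument).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Placeholder for pulling raw data from an upstream source.
    print("extracting raw data")


def normalize() -> None:
    # Placeholder for cleaning and normalizing the extracted batch.
    print("normalizing batch")


def load() -> None:
    # Placeholder for writing the normalized batch to the data platform.
    print("loading into the data platform")


# DAG id, schedule, and task names are illustrative assumptions.
with DAG(
    dag_id="example_ingest_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    normalize_task = PythonOperator(task_id="normalize", python_callable=normalize)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> normalize -> load.
    extract_task >> normalize_task >> load_task
```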

AWS, Apache Airflow, Java, Kafka, Kotlin, Spark, Terraform, Scala, Data modeling

Posted 18 days ago

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 190800.0 - 267100.0 USD per year

๐Ÿ” Technology / Social Media

๐Ÿข Company: Reddit๐Ÿ‘ฅ 1001-5000๐Ÿ’ฐ $410,000,000 Series F over 3 years ago๐Ÿซ‚ Last layoff over 1 year agoNewsContentSocial NetworkSocial Media

  • 4+ years of software engineering experience in a production setting writing clean, maintainable, and well-tested code.
  • Proficient in object-oriented programming languages such as Python or Scala, and experienced in Go.
  • Expertise in SQL dialects such as BigQuery, Spark SQL, or Postgres.
  • Experience in designing and implementing large-scale systems and proactive leadership.
  • Familiarity with cloud services, GCP products, Terraform, Airflow, Kubernetes, CI/CD, and modern cloud infrastructure.
  • Excellent communication skills for collaboration within a service-oriented team and broader organizational context.

  • Collaborate effectively with a team of proficient software engineers to develop and maintain the fundamental platform that powers Reddit's data warehouse infrastructure.
  • Engage in the complete data lifecycle at Reddit, participating in the development process and working with extensive datasets.
  • Design, build, and deliver end-to-end data solutions to improve the reliability, scalability, latency, and efficiency of Reddit's Data Platform.
  • Implement automation for key elements of the development process, including data quality, managing alerts, and handling critical infrastructure operations.
  • Collaborate and share on-call responsibilities, including incident management, with the Data Warehouse team.
  • Guide and support fellow engineers by mentoring and contributing to knowledge sharing through training sessions and documentation.

Python, SQL, Apache Airflow, GCP, Kubernetes, Go, CI/CD, Terraform, Scala

Posted 29 days ago

๐Ÿ“ Canada

๐Ÿงญ Full-Time

๐Ÿ’ธ 120275 - 155650 CAD per year

๐Ÿ” Internet of Things (IoT)

๐Ÿข Company: Samsara๐Ÿ‘ฅ 1001-5000๐Ÿ’ฐ Secondary Market over 4 years ago๐Ÿซ‚ Last layoff over 4 years agoCloud Data ServicesBusiness IntelligenceInternet of ThingsSaaSSoftware

  • Bachelor's Degree in Computer Science/Engineering or equivalent practical experience.
  • 4+ years of experience building and maintaining a large-scale, production-grade data platform.
  • Strong programming and software engineering skills, including Python, Go, Scala, or SQL.
  • 2+ years experience working with Spark.
  • Experience managing data orchestration systems (e.g. Airflow, Flyte, Prefect, Dagster).
  • AWS knowledge and expertise (S3, Lambda, SQS, Kinesis).

  • Develop software to reliably ingest vast amounts of data into our data lake (see the sketch after this list).
  • Explore new infrastructure needed to support the growing needs of our data platform.
  • Design, scope, and build libraries and data management tooling for effective use.
  • Expand the ability to stream data for near-real-time access.
  • Ensure uptime, reliability, and monitoring of the data platform.
  • Implement new tools that make data easier to leverage.
  • Uplevel team members on data best practices and tools.
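
As a sketch of the data-lake ingestion bullet referenced above, the snippet below batches JSON records into gzip-compressed, date-partitioned objects in S3 using boto3. The bucket name, key layout, and record shape are illustrative assumptions, and AWS credentials are assumed to be configured in the environment.

```python
import datetime
import gzip
import json

import boto3

# Bucket name and key layout are illustrative assumptions; a real data lake
# would have its own partitioning and governance conventions.
DATA_LAKE_BUCKET = "example-data-lake"

s3 = boto3.client("s3")


def write_batch(records: list[dict], source: str) -> str:
    """Write one batch of JSON records to S3, partitioned by source and date."""
    now = datetime.datetime.now(datetime.timezone.utc)
    key = (
        f"raw/{source}/"
        f"year={now:%Y}/month={now:%m}/day={now:%d}/"
        f"batch-{now:%H%M%S}.json.gz"
    )

    # Newline-delimited JSON, gzip-compressed, is a common landing format
    # that downstream engines (Spark, Athena, etc.) can read directly.
    body = gzip.compress(
        "\n".join(json.dumps(r) for r in records).encode("utf-8")
    )
    s3.put_object(Bucket=DATA_LAKE_BUCKET, Key=key, Body=body)
    return key


if __name__ == "__main__":
    sample = [{"device_id": "abc", "temp_c": 21.5}, {"device_id": "def", "temp_c": 19.0}]
    print("wrote", write_batch(sample, source="telemetry"))
```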

AWS, Python, SQL, Airflow, Go, Spark, Collaboration, Terraform, Software Engineering

Posted about 1 month ago

๐Ÿ“ USA, India, UK

๐Ÿงญ Full-Time

๐Ÿ” B2B technology

๐Ÿข Company: Demandbase๐Ÿ‘ฅ 501-1000๐Ÿ’ฐ $175,000,000 Debt Financing almost 2 years agoSales AutomationAdvertisingBig DataSaaSAnalyticsB2BMarketingMarketing AutomationSoftware

  • Bachelor's or master's degree in Computer Science, Mathematics, or Statistics from a top engineering institution.
  • 4+ years of Data Engineering experience in building enterprise data/analytics solutions.
  • Strong practical experience in Databases, Advanced SQL, and Python/R.
  • Experience designing and implementing ETL data pipelines using open-source platforms is a plus.
  • Familiarity with big data technologies such as Hive, Redshift, HBase, and Apache Spark.

  • Design, model, and implement data analysis and analytics solutions.
  • Be a hands-on individual contributor for data projects in high-level design, analysis, experiments, data architecture, and data modeling.
  • Support ETL pipeline modules by designing state-of-the-art transformations, data cleaning, matching, reporting dashboards, and statistical analysis (see the sketch after this list).
  • Work closely with cross-functional teams in an Agile environment.
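
As a sketch of the transformation, cleaning, and matching work mentioned above, here is a minimal pandas pass that normalizes text fields, deduplicates records on a key, and flags incomplete rows for review. The column names and rules are illustrative assumptions, not fields from the posting.

```python
import pandas as pd


def clean_accounts(raw: pd.DataFrame) -> pd.DataFrame:
    """Minimal cleaning/matching pass: normalize, deduplicate, and flag gaps.

    Column names ("company_name", "domain", "country") are illustrative
    assumptions, not fields from the posting.
    """
    df = raw.copy()

    # Normalize text fields so trivially different spellings can match.
    df["company_name"] = df["company_name"].str.strip().str.lower()
    df["domain"] = df["domain"].str.strip().str.lower()

    # Deduplicate on the normalized domain, keeping the most complete row.
    df["missing_fields"] = df.isna().sum(axis=1)
    df = (
        df.sort_values("missing_fields")
        .drop_duplicates(subset=["domain"], keep="first")
        .drop(columns="missing_fields")
    )

    # Flag rows that still lack a country for downstream review.
    df["needs_review"] = df["country"].isna()
    return df


if __name__ == "__main__":
    raw = pd.DataFrame(
        {
            "company_name": ["Acme Inc ", "acme inc", "Globex"],
            "domain": ["acme.com", "ACME.com ", "globex.com"],
            "country": ["US", None, None],
        }
    )
    print(clean_accounts(raw))
```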

Python, SQL, Agile, Data Analysis, ETL, Data engineering, Communication Skills, Problem Solving, Data visualization, Data modeling

Posted about 1 month ago

๐Ÿ“ United States

๐Ÿ” Life sciences

  • Applicants must have the unrestricted right to work in the United States.
  • Veeva will not provide sponsorship at this time.

  • Spearhead the development of new architecture for the Data platform from the ground up.
  • Design and build a resilient, scalable cloud-based platform along with its accompanying tools.
  • Empower OpenData teams to efficiently create and distribute valuable data assets.
  • Exercise end-to-end ownership for the project.

Backend Development, Leadership, Software Development, Cross-functional Team Leadership, Communication Skills, Analytical Skills, Collaboration

Posted about 2 months ago

๐Ÿ“ USA

๐Ÿงญ Full-Time

๐Ÿ’ธ 169000 - 240000 USD per year

๐Ÿ” Financial services

  • 5+ years of industry experience in building large scale production systems.
  • Experience building and owning large-scale stream processing systems.
  • Experience building and operating robust and highly available infrastructure.
  • Working knowledge of Relational and NoSQL databases.
  • Experience working with Data Warehouse solutions.
  • Experience with industry-standard stream processing frameworks such as Spark, Samza, Flink, or Beam (see the sketch after this list).
  • Experience leading technical projects and mentoring junior engineers.
  • Exceptionally collaborative, with a history of delivering complex technical projects and working closely with stakeholders.
  • This position requires a Bachelor's degree in a related field or equivalent practical experience.
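
To illustrate the stream-processing requirement flagged above, here is a minimal PySpark Structured Streaming sketch that reads JSON payment events from Kafka and rolls them up into per-minute totals. The topic name, bootstrap servers, and field names are illustrative assumptions; running it requires the spark-sql-kafka connector on the classpath, and the console sink stands in for a real warehouse sink.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Topic and bootstrap servers are illustrative assumptions.
spark = SparkSession.builder.appName("payments-stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "payment-events")
    .load()
)

# Kafka delivers raw bytes; parse the JSON payload into typed columns.
parsed = events.select(
    F.get_json_object(F.col("value").cast("string"), "$.loan_id").alias("loan_id"),
    F.get_json_object(F.col("value").cast("string"), "$.amount").cast("double").alias("amount"),
    F.col("timestamp"),
)

# Roll the stream up into per-minute totals, with a watermark for late data.
totals = (
    parsed.withWatermark("timestamp", "10 minutes")
    .groupBy(F.window("timestamp", "1 minute"))
    .agg(F.sum("amount").alias("total_amount"))
)

# Write the aggregates somewhere queryable; console keeps the sketch runnable.
query = totals.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```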

  • Help support the Data Platform that forms the backbone for several thousand offline workloads at Affirm.
  • Design and build data infrastructure systems, services, and tools to handle new Affirm products and business requirements that securely scale over millions of users and their transactions.
  • Build frameworks and services which will be used by other engineering teams at Affirm to manage billions of dollars in loans and power customer experiences.
  • Improve the reliability and efficiency of the Data Platform at scale.
  • Engage other teams at Affirm about their use of the Data Platform to ensure we are always building the right thing.

Backend Development, Leadership, Software Development, SQL, Data Analysis, ElasticSearch, Kafka, Cross-functional Team Leadership, Apache Kafka, Spark, Collaboration

Posted 2 months ago

🧭 Full-Time

🔍 Biomedical research and development

  • A degree in Computer Science/Engineering or a related field within science.
  • 5+ years experience working as a software developer in the industry.
  • Proficient with Python and SQL.
  • Experience with event-driven architecture using Pub/Sub (see the sketch after this list).
  • A track record in building high-quality, maintainable code.
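
To ground the Pub/Sub requirement flagged above, here is a minimal publisher sketch using the google-cloud-pubsub client. The project and topic names, the event shape, and the "experiment" terminology are illustrative assumptions; credentials are assumed to be configured via the usual Google Cloud mechanisms.

```python
import json

from google.cloud import pubsub_v1

# Project and topic names are placeholders, not details from the posting.
PROJECT_ID = "example-project"
TOPIC_ID = "experiment-results"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)


def publish_experiment_event(experiment_id: str, status: str) -> None:
    """Publish a small JSON event describing an experiment state change."""
    payload = json.dumps({"experiment_id": experiment_id, "status": status}).encode("utf-8")
    # publish() returns a future; result() blocks until the message is accepted.
    future = publisher.publish(topic_path, data=payload)
    print("published message", future.result())


if __name__ == "__main__":
    publish_experiment_event("exp-123", "completed")
```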

  • Collaborate with Machine Learning and Full-Stack engineers and with Science teams to solve complex document mining challenges.
  • Help capture and model additional scientific experiments.
  • Define and apply best practices in a cloud-based environment.
  • Lead or consult on the authoring of engineering design proposals.
  • Proactively identify new opportunities and advocate for project improvements.
  • Respond to operational issues with urgency and own their resolution.
  • Challenge the status quo and propose newer technologies or methods.
  • Scale data pipelines for reliable data transfer from research to platform.
Posted 3 months ago

Related Articles

Posted 4 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

Posted 5 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Posted 5 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 5 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Posted 5 months ago

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.