PeakMetrics

👥 11-50 · 💰 $4,200,000 Seed (about 1 year ago) · Machine Learning · Cyber Security · Marketing Automation · Natural Language Processing · Software · 💼 Private Company

PeakMetrics is a narrative intelligence platform that uses machine learning to help enterprises and government agencies combat online narrative threats. We analyze more than 1.5 million media sources and 10 social media platforms in real time to identify media manipulation and adversarial attacks before they cause damage. Our platform empowers organizations to understand how narratives spread, assess source credibility, and quantify risk.

Our technology stack includes Python, React, and TypeScript, with a focus on backend architecture and frontend development. We leverage cloud platforms like Google Cloud, embrace DevOps practices, and work with technologies such as DNSSEC and data pipelines.

PeakMetrics has a remote-first culture that fosters collaboration and innovation. We are a seed-stage company with a recent seed round of $3 million, and a team of about 11-50 employees committed to a collaborative, high-trust environment. Join us in our mission to protect clients from the risks of social media manipulation. We're looking for skilled engineers who are ready to take on meaningful challenges and help shape the future of media intelligence. Our values center on empowering you to grow in your skills while working alongside a talented, mission-driven group. We offer a competitive salary and equity, alongside benefits that support a healthy work-life balance, including flexible work schedules and unlimited time off.

Jobs at this company:

Apply

📍 United Kingdom

🧭 Full-Time

🔍 Software Development

  • 13+ years of experience in engineering, with at least 5 years in data engineering, data ops, or related roles.
  • Proven experience in designing and implementing data architectures, ETL processes, and DevOps pipelines.
  • Expertise in cloud platforms (AWS, Azure, or GCP) and cloud-native solutions.
  • Strong understanding of data governance, security, and compliance standards (e.g., GDPR, HIPAA).
  • Proficiency in programming and scripting languages (e.g., Python, Java, Bash).
  • Experience with modern DevOps tools such as Kubernetes, Docker, Terraform, Jenkins, or similar.
  • Track record of successfully managing and scaling high-performing technical teams.
  • Experience with big data technologies (e.g., Kafka).
  • Familiarity with ML/AI infrastructure and frameworks.
  • Hands-on experience with both relational and non-relational databases (e.g., SQL, NoSQL).
  • You have maintained data management systems and have built new data pipelines from scratch.
  • You are comfortable automating data flows with resilient code using Python.
  • Experience with Dagster or other data orchestration platforms such as Airflow.
  • You have strong database architecture design and management knowledge in both structured and unstructured data.
  • Advanced knowledge of Elasticsearch or OpenSearch, including configuration, operation, and use for search.
  • Extensive experience working with SQL, SQLAlchemy with Postgres, Athena, Parquet, and Iceberg.
  • You are able to define and communicate data architecture requirements, keeping current with data management best practices.
  • Exceptional leadership, communication, and decision-making abilities.
  • Strong analytical mindset with a solution-oriented approach.
  • Ability to balance strategic vision with tactical execution.
  • Develop and execute the long-term vision for the organization’s data engineering and DevOps strategies.
  • Collaborate with senior leadership to prioritize initiatives, set objectives, and define measurable outcomes.
  • Stay ahead of industry trends, tools, and technologies to ensure competitive advantage and efficiency.
  • Build, mentor, and lead a diverse team of data engineers and DevOps professionals.
  • Foster a culture of innovation, accountability, and collaboration within the team.
  • Establish best practices for performance management, career development, and skills growth.
  • Oversee the design, development, and maintenance of scalable data pipelines, warehouses, and processing frameworks.
  • Build the infrastructure and codebase required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
  • Drive the implementation of best practices in data governance, quality, and security.
  • Ensure the availability, reliability, and performance of data systems.
  • Lead the adoption of modern DevOps methodologies to streamline CI/CD pipelines and deployment processes.
  • Ensure robust monitoring, logging, and alerting systems are in place for all applications and data infrastructure.
  • Partner with cross-functional teams, including product, analytics, and engineering, to understand and deliver on business needs.
  • Present project updates, performance metrics, and strategic initiatives to leadership.
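One bullet above asks for comfort "automating data flows with resilient code using Python." As a minimal, hypothetical illustration (the function names and sample data are ours, not PeakMetrics code), a retry wrapper around an extract/transform/load sequence might look like:

```python
import time


def with_retries(fn, attempts=3, base_delay=0.1):
    """Call fn, retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error to the orchestrator
            time.sleep(base_delay * (2 ** attempt))


def extract():
    # Stand-in for a flaky source read (an API, queue, or S3 listing).
    return [{"source": "news", "mentions": 3}, {"source": "social", "mentions": 7}]


def transform(rows):
    # Drop inactive rows and add a simple derived field.
    return [{**r, "high_volume": r["mentions"] > 5} for r in rows if r["mentions"] > 0]


def load(rows):
    # Stand-in for a database write (e.g. SQLAlchemy against Postgres).
    return len(rows)


def run_pipeline():
    rows = with_retries(extract)
    return load(transform(rows))
```

In an orchestrator such as Dagster or Airflow, each of these steps would typically become its own asset or task, with the retry policy declared on the task rather than hand-rolled.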

AWS, Docker, Leadership, PostgreSQL, Python, SQL, Bash, Cloud Computing, ElasticSearch, ETL, Kafka, Kubernetes, Machine Learning, Airflow, Data Engineering, REST API, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Agile Methodologies, DevOps, Terraform, Data Visualization, Team Management, Ansible, Data Modeling, Data Management

Posted 8 days ago
Apply

📍 Canada

🧭 Full-Time

🔍 Software Development

  • 13+ years of experience in engineering, with at least 5 years in data engineering, data ops, or related roles.
  • Proven experience in designing and implementing data architectures, ETL processes, and DevOps pipelines.
  • Expertise in cloud platforms (AWS, Azure, or GCP) and cloud-native solutions.
  • Strong understanding of data governance, security, and compliance standards (e.g., GDPR, HIPAA).
  • Proficiency in programming and scripting languages (e.g., Python, Java, Bash).
  • Experience with modern DevOps tools such as Kubernetes, Docker, Terraform, Jenkins, or similar.
  • Track record of successfully managing and scaling high-performing technical teams.
  • Experience with big data technologies (e.g., Kafka).
  • Familiarity with ML/AI infrastructure and frameworks.
  • Hands-on experience with both relational and non-relational databases (e.g., SQL, NoSQL).
  • Certifications in cloud platforms or DevOps practices are a plus.
  • You have maintained data management systems and have built new data pipelines from scratch.
  • You are comfortable automating data flows with resilient code using Python.
  • Experience with Dagster or other data orchestration platforms such as Airflow.
  • You have strong database architecture design and management knowledge in both structured and unstructured data.
  • Advanced knowledge of Elasticsearch or OpenSearch, including configuration, operation, and use for search.
  • Extensive experience working with SQL, SQLAlchemy with Postgres, Athena, Parquet, and Iceberg.
  • Develop and execute the long-term vision for the organization’s data engineering and DevOps strategies.
  • Collaborate with senior leadership to prioritize initiatives, set objectives, and define measurable outcomes.
  • Stay ahead of industry trends, tools, and technologies to ensure competitive advantage and efficiency.
  • Build, mentor, and lead a diverse team of data engineers and DevOps professionals.
  • Foster a culture of innovation, accountability, and collaboration within the team.
  • Establish best practices for performance management, career development, and skills growth.
  • Oversee the design, development, and maintenance of scalable data pipelines, warehouses, and processing frameworks.
  • Build the infrastructure and codebase required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
  • Drive the implementation of best practices in data governance, quality, and security.
  • Ensure the availability, reliability, and performance of data systems.
  • Lead the adoption of modern DevOps methodologies to streamline CI/CD pipelines and deployment processes.
  • Ensure robust monitoring, logging, and alerting systems are in place for all applications and data infrastructure.
  • Partner with cross-functional teams, including product, analytics, and engineering, to understand and deliver on business needs.
  • Present project updates, performance metrics, and strategic initiatives to leadership.

AWS, Docker, PostgreSQL, Python, SQL, Bash, Cloud Computing, ElasticSearch, ETL, Jenkins, Kafka, Kubernetes, Machine Learning, Airflow, Data Engineering, CI/CD, DevOps, Terraform, Ansible, Data Modeling, Data Management

Posted 8 days ago
Apply

📍 United States

🧭 Full-Time

🔍 SaaS, AI

  • 5-7+ years of experience in SaaS enterprise sales, with at least 3 years in a leadership role (e.g., Head of Sales, Director of Sales).
  • Proven success in scaling and driving revenue growth in a fast-paced, high-growth environment.
  • Deep understanding of sales metrics and sales methods (ARR/MRR, churn, CAC, LTV).
  • Demonstrated ability to sell complex B2B solutions to mid-market and enterprise customers.
  • Experience at a SaaS, AI, or data analytics growth-stage startup.
  • Enjoys understanding the technology and underlying product, or has prior technical or solutions engineering experience.
  • Familiarity with product-led growth strategies and sales automation tools.
  • Strong leadership and people management skills.
  • Data-driven decision-maker with expertise in CRM tools and sales analytics.
  • Exceptional communication, negotiation, and relationship-building skills.
  • Strategic thinker with the ability to execute tactically and pivot as needed.
  • Entrepreneurial mindset with a passion for innovation and growth.
  • Develop and implement a scalable sales strategy to drive predictable and sustainable revenue growth.
  • Establish sales goals, KPIs, and metrics that align with company objectives (e.g., ARR, MRR, churn, and LTV).
  • Identify new market opportunities, target segments, and growth channels.
  • Refine and optimize the sales process to improve efficiency and shorten sales cycles.
  • Own the sales pipeline, forecast revenue, and ensure accurate reporting.
  • Lead efforts to acquire, retain, and expand customers across SMB, mid-market, or enterprise verticals (depending on company focus).
  • Drive customer upselling, cross-selling, and renewal strategies to maximize LTV.
  • Leverage CRM tools (e.g., HubSpot) to track key metrics, manage the sales pipeline, and generate actionable insights.
  • Monitor performance metrics such as conversion rates, CAC, and retention to inform strategy.
  • Use analytics to identify gaps and opportunities, iterating on strategies as needed.
  • Build strong relationships with key customers, acting as a trusted advisor and advocate.
  • Collaborate with marketing to align on lead generation and nurturing campaigns.
  • Gather market insights and customer feedback to inform product development and competitive positioning.
  • Partner with product teams to ensure customer feedback influences the product roadmap.
  • Work with customer success teams to ensure smooth onboarding and exceptional customer experiences.
  • Collaborate with finance and operations to ensure alignment on pricing, discounts, and deal structures.

Leadership, Artificial Intelligence, Data Analysis, People Management, Salesforce, Cross-functional Team Leadership, Product Development, Communication Skills, RESTful APIs, Negotiation, Account Management, Sales Experience, Market Research, Lead Generation, Strategic Thinking, CRM, Customer Success, SaaS

Posted 8 days ago
Apply

📍 United Kingdom

🧭 Full-Time

🔍 Engineering

  • 13+ years of experience in engineering, with at least 5 years in data engineering, data ops, or related roles.
  • Proven experience in designing and implementing data architectures, ETL processes, and DevOps pipelines.
  • Expertise in cloud platforms (AWS, Azure, or GCP) and cloud-native solutions.
  • Strong understanding of data governance, security, and compliance standards (e.g., GDPR, HIPAA).
  • Proficiency in programming and scripting languages (e.g., Python, Java, Bash).
  • Experience with modern DevOps tools such as Kubernetes, Docker, Terraform, Jenkins, or similar.
  • Track record of successfully managing and scaling high-performing technical teams.
  • Experience with big data technologies (e.g., Kafka).
  • Familiarity with ML/AI infrastructure and frameworks.
  • Hands-on experience with both relational and non-relational databases (e.g., SQL, NoSQL).
  • You have maintained data management systems and have built new data pipelines from scratch.
  • You are comfortable automating data flows with resilient code using Python.
  • Experience with Dagster or other data orchestration platforms such as Airflow.
  • You have strong database architecture design and management knowledge in both structured and unstructured data.
  • Advanced knowledge of Elasticsearch or OpenSearch, including configuration, operation, and use for search.
  • Extensive experience working with SQL, SQLAlchemy with Postgres, Athena, Parquet, and Iceberg.
  • You are able to define and communicate data architecture requirements, keeping current with data management best practices.
  • Exceptional leadership, communication, and decision-making abilities.
  • Strong analytical mindset with a solution-oriented approach.
  • Ability to balance strategic vision with tactical execution.
  • Develop and execute the long-term vision for the organization’s data engineering and DevOps strategies.
  • Collaborate with senior leadership to prioritize initiatives, set objectives, and define measurable outcomes.
  • Stay ahead of industry trends, tools, and technologies to ensure competitive advantage and efficiency.
  • Build, mentor, and lead a diverse team of data engineers and DevOps professionals.
  • Foster a culture of innovation, accountability, and collaboration within the team.
  • Establish best practices for performance management, career development, and skills growth.
  • Oversee the design, development, and maintenance of scalable data pipelines, warehouses, and processing frameworks.
  • Build the infrastructure and codebase required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
  • Drive the implementation of best practices in data governance, quality, and security.
  • Ensure the availability, reliability, and performance of data systems.
  • Lead the adoption of modern DevOps methodologies to streamline CI/CD pipelines and deployment processes.
  • Ensure robust monitoring, logging, and alerting systems are in place for all applications and data infrastructure.
  • Partner with cross-functional teams, including product, analytics, and engineering, to understand and deliver on business needs.
  • Present project updates, performance metrics, and strategic initiatives to leadership.

AWS, Docker, Leadership, PostgreSQL, Python, SQL, Bash, Cloud Computing, ElasticSearch, ETL, Jenkins, Kafka, Kubernetes, Machine Learning, Airflow, Data Engineering, REST API, NoSQL, CI/CD, DevOps, Terraform, Team Management, Ansible, Scripting, Data Management

Posted 8 days ago
Apply

📍 Canada

🧭 Full-Time

🔍 Software Development

  • 13+ years of experience in engineering, with at least 5 years in data engineering, data ops, or related roles.
  • Proven experience in designing and implementing data architectures, ETL processes, and DevOps pipelines.
  • Expertise in cloud platforms (AWS, Azure, or GCP) and cloud-native solutions.
  • Strong understanding of data governance, security, and compliance standards (e.g., GDPR, HIPAA).
  • Proficiency in programming and scripting languages (e.g., Python, Java, Bash).
  • Experience with modern DevOps tools such as Kubernetes, Docker, Terraform, Jenkins, or similar.
  • Track record of successfully managing and scaling high-performing technical teams.
  • Experience with big data technologies (e.g., Kafka).
  • Familiarity with ML/AI infrastructure and frameworks.
  • Hands-on experience with both relational and non-relational databases (e.g., SQL, NoSQL).
  • You have maintained data management systems and have built new data pipelines from scratch.
  • You are comfortable automating data flows with resilient code using Python.
  • Experience with Dagster or other data orchestration platforms such as Airflow.
  • You have strong database architecture design and management knowledge in both structured and unstructured data.
  • Advanced knowledge of Elasticsearch or OpenSearch, including configuration, operation, and use for search.
  • Extensive experience working with SQL, SQLAlchemy with Postgres, Athena, Parquet, and Iceberg.
  • Develop and execute the long-term vision for the organization’s data engineering and DevOps strategies.
  • Collaborate with senior leadership to prioritize initiatives, set objectives, and define measurable outcomes.
  • Stay ahead of industry trends, tools, and technologies to ensure competitive advantage and efficiency.
  • Build, mentor, and lead a diverse team of data engineers and DevOps professionals.
  • Foster a culture of innovation, accountability, and collaboration within the team.
  • Establish best practices for performance management, career development, and skills growth.
  • Oversee the design, development, and maintenance of scalable data pipelines, warehouses, and processing frameworks.
  • Build the infrastructure and codebase required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
  • Drive the implementation of best practices in data governance, quality, and security.
  • Ensure the availability, reliability, and performance of data systems.
  • Lead the adoption of modern DevOps methodologies to streamline CI/CD pipelines and deployment processes.
  • Ensure robust monitoring, logging, and alerting systems are in place for all applications and data infrastructure.
  • Partner with cross-functional teams, including product, analytics, and engineering, to understand and deliver on business needs.
  • Present project updates, performance metrics, and strategic initiatives to leadership.

AWS, Docker, Leadership, PostgreSQL, Python, SQL, Bash, Cloud Computing, ElasticSearch, ETL, Jenkins, Kafka, Kubernetes, Machine Learning, Airflow, Algorithms, Data Engineering, REST API, NoSQL, CI/CD, DevOps, Terraform, Data Visualization, Team Management, Data Modeling, Scripting, Data Management

Posted 8 days ago
Apply

📍 United States

🧭 Full-Time

🔍 Government

  • 4+ years working in the DoD
  • 2+ years working in a BD role with a commercial technology company selling to DoD and other federal agencies
  • Proven track record of winning and maintaining six-figure and larger deals with Defense and federal agencies
  • Experience navigating DoD budgeting and acquisition processes to turn leads into recurring revenue
  • Experience working with OIE, OSINT, and/or Media Monitoring teams
  • Skilled at building relationships across stakeholders while shaping demand signals for PAI/OSINT capabilities
  • Excellent written and verbal communication skills
  • Drive business development efforts to engage with potential customers and close federal contract opportunities
  • Propel new and existing opportunities through the sales pipeline
  • Maintain relationships with key stakeholders in existing accounts
  • Maintain a CRM to keep track of ongoing conversations and relationships
  • Develop a strong understanding of PeakMetrics’ capabilities and how they can provide value to customers
  • Develop proposals for relevant solicitations, RFPs, and other contract opportunities
  • Travel to support on-site customer meetings and conferences

Business Development, Salesforce, Communication Skills, CI/CD, Written Communication, Account Management, Negotiation Skills, Budgeting, Relationship Management, Sales Experience, Stakeholder Management, CRM

Posted 8 days ago
Apply

📍 United States

🧭 Full-Time

🔍 Software Development

  • 7+ years of experience in full-stack software development with leadership responsibilities.
  • Strong backend experience (Python, Node.js, or similar) and frontend expertise (React, TypeScript, etc.).
  • Experience in scaling architectures, optimizing performance, and managing infrastructure.
  • Prior experience in leading or mentoring engineers in a startup or fast-moving environment.
  • Experience with AI/ML, NLP, or media intelligence platforms.
  • Familiarity with cloud platforms (AWS, GCP) and DevOps practices.
  • Hands-on experience with data pipelines and analytics-heavy applications.
  • Act as a mentor, unblock engineers, and provide technical guidance when needed.
  • Help manage priorities, technical decisions, and team operations when necessary.
  • Design, develop, and deploy full-stack solutions across our platform.
  • Help improve workflows, optimize sprints, and drive engineering best practices.
  • Architect and optimize systems to handle growing data and user needs.
  • Work closely with Product, Customer Success, and Leadership to deliver impactful features.
  • Conduct code reviews, provide feedback, and help grow technical skills across the team.

AWS, Backend Development, Leadership, Project Management, Python, Software Development, SQL, Cloud Computing, Frontend Development, Full Stack Development, Git, Software Architecture, TypeScript, REST API, React, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Mentoring, DevOps, Team Management, NodeJS, Data Analytics

Posted 9 days ago
Apply