Tecknoworks Europe

Tecknoworks Europe is a global technology consulting and delivery company that empowers clients to enhance productivity and profit through innovative technology solutions. Committed to creating a positive impact, Tecknoworks fosters personal and professional growth among its team members while serving a diverse range of clients from mid-sized businesses to large corporations.

Jobs at this company:

🔥 Data Scientist | ML
Posted 2024-10-24

📍 Romania

🔍 Technology consulting

  • Proficiency in programming languages such as Python, R, or SQL.
  • Experience with data manipulation and analysis libraries (e.g., pandas, NumPy, Scikit-learn, SciPy, OpenCV).
  • Familiarity with machine learning/AutoML frameworks (e.g., TensorFlow, Keras, PyTorch, JAX, Auto-Sklearn, Auto-PyTorch).
  • Knowledge of data visualization tools (e.g., Power BI, Tableau).
  • Proven track record of working with large datasets and developing predictive models.
  • Experience with cloud platforms (e.g., AWS, Azure) is a plus.
  • Strong problem-solving skills and analytical thinking.
  • Excellent communication and presentation skills.
  • Ability to work collaboratively in a team environment and with stakeholders.

  • Perform exploratory data analysis to discover trends, patterns, and insights.
  • Process, cleanse, and validate data integrity for analysis.
  • Mine data from valuable sources.
  • Develop and validate models/algorithms using machine learning techniques (a minimal sketch follows this list).
  • Optimize models for accuracy, efficiency, and scalability.
  • Collaborate with teams to translate business requirements into data-driven solutions.
  • Communicate insights to stakeholders through visualizations, reports, and presentations.
  • Stay updated with data science and machine learning advancements.
  • Seek opportunities to improve existing processes.
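
A minimal sketch of the model-development loop described above, using pandas and scikit-learn. The dataset, file name, and column names are hypothetical placeholders, not from this posting:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

# Hypothetical dataset: path and column names are placeholders,
# and feature columns are assumed to be numeric.
df = pd.read_csv("customer_churn.csv")

# Cleanse and validate before modeling: drop rows missing the label.
df = df.dropna(subset=["churned"])

X = df.drop(columns=["churned"])
y = df["churned"]

# Hold out a stratified validation split to check generalization.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Validate on unseen data and report per-class metrics.
print(classification_report(y_test, model.predict(X_test)))
```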

Python, SQL, Data Analysis, Data Mining, Keras, Machine Learning, NumPy, OpenCV, PyTorch, Tableau, Algorithms, Data Science, Pandas, TensorFlow, Analytical Skills, Problem Solving

🔥 Data Engineer | AWS
Posted 2024-10-24

📍 Romania

🔍 Technology consulting

  • Experience integrating data into analytical platforms: ingestion technologies, data profiling, source-to-target mapping, ETL development, SQL optimization, testing, and implementation.
  • Experience in leading and deploying data solutions on cloud platforms like AWS using services such as Lambda, S3, Redshift, Glue, and DynamoDB.
  • Experience in large-scale data warehousing and analytics utilizing Snowflake.
  • Experience in programming languages like Python, Java, or Scala.
  • Experience with databases (SQL and NoSQL), data processing frameworks (like Hadoop, Spark, and Kafka), and cloud platforms.
  • Experience in managing structured and unstructured data and driving value from big data and data modeling.
  • Track record of working in Agile teams to deliver results against challenging timescales.
  • Experience with Matillion is a plus.
  • Certifications in Cloud technologies and data engineering, e.g., AWS, Snowflake, and Data Warehouse, are a plus.

  • Collaborate closely with architects, engineers, and business teams to understand business requirements and deliver solutions that meet them.
  • Build data pipelines covering transformation, processing, quality checks, and curation, following documented business and technical requirements (see the sketch after this list).
  • Develop analytical solutions, taking full responsibility for data engineering.
  • Build the data models and data flows for analytical solutions and integrate Data Science routines into the models as required.
  • Implement standard methodologies within the Analytical Solution department.
  • Stay current with the latest trends and technologies in cloud computing and data engineering.
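
One hedged illustration of the AWS pipeline work described above: a Lambda-style handler that reacts to a new S3 object and loads it into Redshift via the Redshift Data API. The bucket, cluster, database, table, and IAM role names are placeholders, and error handling is omitted for brevity:

```python
import boto3

redshift_data = boto3.client("redshift-data")

def handler(event, context):
    """Triggered by an S3 put event; loads the new object into Redshift."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Target table and IAM role are hypothetical placeholders.
    copy_sql = (
        f"COPY analytics.raw_events "
        f"FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role' "
        f"FORMAT AS PARQUET;"
    )

    # Run the COPY asynchronously via the Redshift Data API.
    redshift_data.execute_statement(
        ClusterIdentifier="analytics-cluster",
        Database="analytics",
        DbUser="etl_user",
        Sql=copy_sql,
    )
```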

🔥 Data Engineer

📍 Romania

🧭 Full-Time

🔍 Technology consulting

  • Prior experience designing, developing, and maintaining scalable data pipelines using Python and SQL.
  • Exposure to distributed data processing systems with Apache Spark and workflow orchestration using Apache Airflow.
  • Experience managing cloud-based data platforms built on AWS and Azure technologies.
  • Strong SQL skills for querying in RDBMS and NoSQL systems.
  • Experience with DBT for data quality frameworks and understanding of data warehousing concepts.

  • Design, build, and maintain scalable data pipelines (a minimal orchestration sketch follows this list).
  • Collaborate with cross-functional teams on cloud platforms and distributed data processing frameworks.
  • Implement distributed data processing systems with Apache Spark.
  • Implement and manage ETL processes using Matillion.
  • Manage cloud-based data platforms integrating with Snowflake, Databricks, etc.
  • Design and implement data quality frameworks.
  • Work with data scientists and analysts to understand requirements.
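
A minimal Apache Airflow sketch of the orchestration mentioned above, assuming Airflow 2.x; the DAG id and task bodies are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real task would pull from a source system.
    pass


def transform():
    # Placeholder transform step; a real task might submit a Spark job.
    pass


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run transform only after extract succeeds.
    extract_task >> transform_task
```
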
Posted 2024-10-24
🔥 Data Analyst
Posted 2024-10-24

📍 Romania

🔍 Technology consulting

  • Proficiency in SQL with strong ability to write and optimize complex queries.
  • Proficient in Python for data manipulation, analysis, and automation scripting.
  • Hands-on experience with data visualization tools like Tableau or Power BI to build dashboards.
  • Understanding of ETL processes and experience in designing automated reporting systems.
  • Knowledge of data modeling concepts for building scalable reporting solutions.
  • Analytical thinking to break down complex business problems and design suitable data solutions.
  • Strong attention to detail for maintaining data accuracy and quality control.
  • Ability to work with cross-functional teams to understand business objectives related to data analysis.
  • Familiarity with cloud data warehouses like AWS Redshift or Snowflake is a plus.
  • Strong verbal and written communication skills for presenting findings to various stakeholders.

  • Extract, manipulate, and analyze large datasets using SQL and Python to provide data-driven insights (a minimal sketch follows this list).
  • Design, develop, and maintain automated reporting solutions and dashboards using data visualization tools like Tableau or Power BI.
  • Work closely with business stakeholders to understand their requirements and translate them into scalable reporting solutions.
  • Create and optimize SQL queries for efficient data extraction.
  • Support the automation of data extraction, transformation, and reporting processes to enhance efficiency.
  • Collaborate with engineering and business teams to ensure data availability and quality.
  • Troubleshoot and resolve data discrepancies, reporting issues, and data integrity challenges.
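
A minimal sketch of the SQL-plus-Python extraction workflow described above, using pandas and SQLAlchemy; the connection string, schema, and column names are illustrative placeholders:

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Connection string, schema, and column names are placeholders.
engine = create_engine("postgresql://user:password@host:5432/analytics")

# Parameterized query: safe to rerun with different date windows.
query = text("""
    SELECT CAST(order_date AS DATE) AS day,
           COUNT(*)                 AS orders,
           SUM(total_amount)        AS revenue
    FROM sales.orders
    WHERE order_date >= :start
    GROUP BY 1
    ORDER BY 1
""")

daily = pd.read_sql(query, engine, params={"start": "2024-01-01"})

# Write a flat file a dashboard refresh job could pick up.
daily.to_csv("daily_revenue.csv", index=False)
```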

Python, SQL, Business Intelligence, Data Analysis, ETL, Tableau, Data Science, Communication Skills, Attention to Detail, Written Communication

🔥 DevOps Engineer

📍 Romania

🔍 Technology consulting

  • 3+ years of experience as a DevOps Engineer or in a related role with strong experience in both AWS and Azure cloud environments.
  • Expertise in AWS services (EC2, S3, RDS, Lambda, VPC, IAM, etc.) and Azure services (VMs, Azure Storage, Azure Functions, Virtual Networks, etc.).
  • Proficiency in Infrastructure-as-Code (Terraform, CloudFormation, Bicep).
  • Experience with CI/CD tools (Jenkins, GitLab CI, Azure DevOps).
  • Strong scripting skills in Bash, Python, PowerShell, or similar.
  • Experience with containerization (Docker, Kubernetes, AWS ECS/EKS, Azure AKS).
  • Familiarity with monitoring tools such as AWS CloudWatch, Azure Monitor, ELK Stack, or Prometheus.
  • Knowledge of networking concepts: VPN, DNS, Load Balancing, Firewall Rules, etc.
  • Experience with security best practices: IAM, VPC Security Groups, Firewalls, Encryption, etc.
  • Strong problem-solving skills and ability to work independently or as part of a team.
  • Experience in agile methodologies and DevOps culture.

  • Design, configure, and manage scalable, secure, and highly available cloud infrastructure on both AWS and Azure platforms.
  • Build, maintain, and enhance Continuous Integration/Continuous Deployment (CI/CD) pipelines using tools like Jenkins, GitLab CI, or Azure DevOps for automated testing, deployment, and monitoring.
  • Develop infrastructure-as-code (IaC) using tools like Terraform, CloudFormation, or Bicep/ARM templates to automate cloud resource provisioning and configuration.
  • Implement and manage monitoring, logging, and alerting systems using services like AWS CloudWatch, Azure Monitor, ELK Stack, or Prometheus to ensure visibility into performance, reliability, and security (a minimal sketch follows this list).
  • Ensure cloud infrastructure follows industry best practices for security, including IAM policies, network security groups, firewalls, and encryption.
  • Manage containerized applications using Docker and orchestration tools like Kubernetes or AWS ECS/EKS.
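
A small sketch of the monitoring responsibility above, using boto3 to create a CloudWatch CPU alarm; the instance id, alarm name, and SNS topic ARN are placeholders:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Instance id, alarm name, and SNS topic ARN are placeholders.
cloudwatch.put_metric_alarm(
    AlarmName="high-cpu-web-1",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                # evaluate 5-minute averages
    EvaluationPeriods=2,       # two consecutive breaches before alarming
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:eu-west-1:123456789012:ops-alerts"],
)
```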

AWS, Docker, Python, SQL, Bash, Jenkins, Kubernetes, Azure, Prometheus, Serverless, CI/CD, Agile Methodologies

Posted 2024-10-15