Responsibilities
● Design and Development:
○ Design, develop, and maintain scalable and efficient data pipelines.
○ Implement data integration solutions to combine data from various sources into a cohesive data warehouse.
○ Develop ETL (Extract, Transform, Load) processes to transform raw data into useful formats for analysis.
● Data Management:
○ Ensure data quality and integrity by implementing data validation and cleaning processes.
○ Optimize database performance through indexing, partitioning, and query optimization.
○ Manage and maintain data storage solutions, including data warehouses and data lakes.
● Collaboration:
○ Work closely with data scientists, analysts, and other stakeholders to understand data needs and requirements.
○ Collaborate with software engineers to integrate data solutions into applications and systems.
○ Create and maintain comprehensive documentation for data systems, processes, and workflows to ensure clarity and facilitate knowledge sharing.
○ Provide guidance and mentorship to junior data engineers, lead projects, and contribute to the strategic direction of data engineering initiatives.
● Innovation and Improvement:
○ Stay current with emerging technologies and industry trends to continuously improve data engineering practices.
○ Identify opportunities to enhance data infrastructure and implement best practices for data management.
○ Lead initiatives to automate and streamline data engineering processes.
● Security and Compliance:
○ Ensure data security and compliance with relevant regulations and standards.
○ Implement data governance policies to maintain data privacy and protection.
Requirements
Required Qualifications
● Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
● 5+ years of experience in data engineering or a related role.
● Proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, Oracle).
● Experience with big data technologies such as Hadoop, Spark, or Kafka.
● Experience using data orchestration tools such as Airflow, Dagster, or Prefect.
● Familiarity with cloud platforms (e.g., AWS, Google Cloud, Azure) and related services (e.g., Redshift, BigQuery).
● Strong programming skills in languages such as Python, Java, or Scala.
● Experience with data modeling, data warehousing, and building ETL pipelines.
● Knowledge of data governance, data security, and compliance practices.
● Excellent problem-solving skills and attention to detail.
● Strong communication and collaboration skills.
Preferred Qualifications
● Experience with machine learning and data science workflows.
● Familiarity with containerization (e.g., Docker, Kubernetes).
● Experience with real-time data processing and stream analytics.
● Knowledge of CI/CD pipelines and DevOps practices.
● Certification in relevant technologies or platforms.