At Schwarz Global Services Barcelona, we provide high value IT services for the entire Schwarz Group, which includes Lidl, Kaufland, Schwarz Produktion, PreZero, Schwarz Digits, STACKIT, and XMCyber.
As part of a top 5 global retail company, we serve 6 billion customers through 13,700 stores in 32 countries, supported by over 575,000 employees.
We are looking for open-minded colleagues with a passion for technology who are eager to pursue diverse and exciting career opportunities in a dynamic work environment that stands for development and progress.
Elevate your career with us, where development and progress are at the heart of everything we do.
Your Tasks
- Collaborate with other teams to build data services for the ingestion, processing, and visualization of data product insights.
- Integrate cloud providers and third-party tools to give teams a holistic overview of their cloud costs, code quality, and software security.
- Provide essential platform services such as billing data ingestion, end-user configuration and management portals, data contract management, and data pipelines between services.
- Design, develop, and implement data integration solutions supporting batch/ETL and API-led integrations that deliver tangible business value.
- Proactively assess the current state of the technology, identify gaps and overlaps, and capture the future-state technology vision in actionable, context-specific roadmaps.
- Develop policies around data quality, data security, data retention, and data stewardship, and proactively identify and support project impacts.
- Serve as an expert-level technical resource across multiple initiatives.
- Work in a team-based environment that includes a global workforce, vendors, and third-party contractors.
- Translate high-level business requirements into detailed technical specifications.
- Collaborate closely with peers, offering mentorship and fostering a knowledge-sharing environment.
- Continuously evaluate and advocate for advanced tools, technologies, and processes that drive industry best practices.
- Actively participate in our Agile development processes, contributing to team meetings and delivering incremental improvements.
- Collaborate with cross-functional teams to understand data requirements and deliver reliable data solutions.
- Monitor data pipeline performance, troubleshoot issues, and implement optimizations to improve efficiency.
- Assist in the design and development of APIs for seamless data integration across platforms.
Your Profile
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of data engineering experience designing and building complex data pipelines that support data analytics and data warehousing.
- 4+ years working with data and relevant computation frameworks and systems.
- 4+ years using the Python programming language.
- 5+ years' experience writing SQL and optimizing queries.
- 3+ years of experience working with columnar databases such as AWS Redshift, Snowflake, Databricks, or Vertica.
- Nice to have: Master's degree in Computer Science or Engineering.
- Nice to have: Cloud technologies: proven experience in a cloud computing environment, preferably GCP, Azure, AWS, or similar.
- Nice to have: Python: practical development and data analysis experience using Python and/or PySpark.
- Nice to have: Advanced data processing: knowledge of data processing technologies such as Apache Spark, Flink, or Kafka.
- Nice to have: Workflow management: familiarity with orchestration and scheduling tools such as Apache Airflow.
- Nice to have: Experience with data reporting and visualization tools (e.g., Power BI, MicroStrategy, Tableau, or similar).
- Nice to have: Experience with Agile data engineering principles and methodologies.
- Nice to have: Exceptional problem-solving skills and willingness to learn new concepts, methods, and technologies.
- Nice to have: Strong understanding of ELT methodologies and tools.
- Nice to have: Experience in data warehousing and familiarity with data warehousing concepts and terminology.
- Nice to have: Ability to troubleshoot and conduct root cause analysis to resolve data issues effectively.
- Nice to have: Experience analyzing and developing physical database designs, data models, and metadata.
- Nice to have: Communication: outstanding communication skills, coupled with strong problem-solving, organizational, and analytical abilities.
We look forward to receiving your application.
Schwarz Dienstleistung KG · Laura Hernandez Costa · Reference no. 44241
Stiftsbergstraße 1 · 74172 Neckarsulm, Germany
www.careers.schwarz
Data Engineer (m/f/d)