Work Background
Data Engineer
Calibre Scientific
Mar. 2024 - Apr. 2025 | United States
SAP Datasphere: Managed complex data flows, orchestrated data integration, and prepared data for SAP Analytics Cloud, optimizing integration processes and reducing latency.
SAP Analytics Cloud: Prepared and structured data for seamless integration, enabling advanced analytics and real-time dashboards that improved decision-making accuracy.
SQL Server: Performed comprehensive data extractions and scripted automation for daily tasks, significantly reducing manual effort.
Airflow: Implemented and managed workflows for scheduling and automating data pipelines between SAP Datasphere and platforms such as Snowflake and Databricks, improving pipeline reliability.
Databricks: Used Databricks for scalable data processing and analytics, contributing to predictive analytics models that improved forecasting accuracy.
BigQuery: Developed optimized queries over large data volumes and created and managed databases supporting detailed analytical processes.
PostgreSQL to GCP Migration: Migrated on-premises PostgreSQL databases to Google Cloud Platform (GCP), using Airflow for orchestration (a minimal DAG sketch follows this entry).
Zoho API: Integrated and automated business processes via the Zoho API, streamlining workflows and documenting them for future maintenance.
Automation: Automated and improved business workflows through scripting and process optimization, significantly increasing operational efficiency.
Excel to Views: Replaced Excel-based reports with database views, improving reporting efficiency and accuracy.
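Illustrative only, not code from this role: a minimal sketch of the kind of Airflow DAG that could orchestrate the PostgreSQL-to-GCP load described above, assuming Airflow 2.x with the Postgres and Google providers installed; the connection IDs, table names, and SQL are hypothetical.

```python
# Minimal Airflow sketch (hypothetical names): daily on-prem PostgreSQL -> BigQuery load.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook


@dag(schedule="@daily", start_date=datetime(2024, 3, 1), catchup=False, tags=["migration"])
def postgres_to_bigquery():
    @task
    def extract_orders() -> list[dict]:
        # Pull the previous day's rows from the on-prem PostgreSQL database.
        hook = PostgresHook(postgres_conn_id="onprem_postgres")  # hypothetical connection id
        rows = hook.get_records(
            "SELECT id, amount, created_at FROM orders "
            "WHERE created_at >= CURRENT_DATE - INTERVAL '1 day'"
        )
        return [{"id": r[0], "amount": float(r[1]), "created_at": str(r[2])} for r in rows]

    @task
    def load_to_bigquery(records: list[dict]) -> None:
        # Append the extracted rows to a BigQuery staging table (assumed to exist).
        hook = BigQueryHook(gcp_conn_id="gcp_default")  # hypothetical connection id
        client = hook.get_client()
        client.insert_rows_json("analytics.staging_orders", records)

    load_to_bigquery(extract_orders())


postgres_to_bigquery()
```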
Data Engineer
Aubay Portugal
Nov. 2022 - Jan. 2024 | Lisbon, Portugal
SAP HANA: Managed procedures and triggers in SAP HANA, ensuring consistent performance, operational stability, and integrity of critical data.
SAP HANA to Snowflake Migration: Led the migration of SAP HANA databases to Snowflake, using Airflow to automate processes, maintain operational continuity, and significantly reduce transfer times and risk (a minimal sketch of one migration step follows this entry).
Airflow: Developed and managed robust, scalable data pipelines, optimizing data flows and improving reliability and performance across multiple platforms and services.
dbt Cloud: Implemented dbt Cloud for data transformation, modeling, and governance, enabling agile integrations, accurate analytics, and rapid delivery to business areas.
ELT Architecture: Designed and maintained a layered ELT architecture (Stage, Transform, Test, and Product layers), improving the quality and standardization of deliverables.
Jira: Used Jira for agile project management and transparent tracking of tasks and project demands.
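Illustrative only: a sketch of a single HANA-to-Snowflake table copy of the kind an Airflow task in the migration could run, assuming the hdbcli and snowflake-connector-python packages; hostnames, credentials, and table names are hypothetical, and the target table is assumed to already exist in the stage layer.

```python
# Sketch of one HANA -> Snowflake table copy (hypothetical connection details).
import pandas as pd
import snowflake.connector
from hdbcli import dbapi
from snowflake.connector.pandas_tools import write_pandas


def copy_table(table: str = "SALES_ORDERS") -> int:
    # Read the source table from SAP HANA into a DataFrame.
    hana = dbapi.connect(address="hana.internal", port=30015,
                         user="MIGRATION_USER", password="***")
    df = pd.read_sql(f'SELECT * FROM "ERP"."{table}"', hana)
    hana.close()

    # Append the rows to the matching table in the Snowflake stage layer.
    sf = snowflake.connector.connect(account="xy12345", user="LOADER", password="***",
                                     warehouse="LOAD_WH", database="STAGE", schema="ERP")
    write_pandas(sf, df, table)
    sf.close()
    return len(df)
```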
Data Engineer
Accenture
Nov. 2020 - Nov. 2022 | São Paulo, Brazil
ETL Development (WhereScape): Designed and implemented ETL processes integrating various data sources with WhereScape, developing dimensional and fact tables in a Star Schema to support complex analytical requirements.
Snowflake: Architected and developed advanced ETL processes on Snowflake, ensuring scalable, high-performance data integration and reliable reporting for ongoing maintenance.
Big Data Integration (PySpark/AWS Glue): Collected, integrated, cleansed, and consolidated structured and unstructured data using PySpark and AWS Glue, ensuring data quality and consistency.
Legacy vs. Cloud Model Comparison: Compared legacy data models against the new cloud implementations to validate data accuracy, reliability, and integrity.
Splunk ITSI: Captured server logs and established KPI monitoring with Splunk ITSI, proactively detecting service degradation through machine learning and historical analysis; developed automated self-remediation triggers in Python integrated with Splunk ITSI.
Big Data Pipeline Design: Engineered end-to-end pipelines that captured data from the YARN API with PySpark, stored non-relational data in MongoDB, processed it with PySpark, and transported logs through Apache Kafka (a minimal sketch of the capture step follows this entry).
Control-M & Jenkins: Scheduled and executed batch workflows with Control-M and streamlined CI/CD via Jenkins, improving efficiency and reducing deployment cycles.
Process Automation (Pipefy): Automated business workflows in Pipefy using Python and GraphQL, improving operational agility and reducing manual intervention.
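Illustrative only: a minimal sketch of the capture-and-transport step, polling application metrics from the YARN ResourceManager REST API and publishing them to Kafka with kafka-python; the host, broker, and topic names are hypothetical.

```python
# Sketch (hypothetical hosts and topic): poll YARN application metrics and push them to Kafka.
import json

import requests
from kafka import KafkaProducer

RM_APPS_URL = "http://yarn-rm.internal:8088/ws/v1/cluster/apps"  # hypothetical ResourceManager host
TOPIC = "yarn-app-metrics"  # hypothetical topic

producer = KafkaProducer(
    bootstrap_servers="kafka.internal:9092",  # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# The ResourceManager returns {"apps": {"app": [...]}}; keep a few fields per application.
apps = requests.get(RM_APPS_URL, timeout=30).json().get("apps") or {}
for app in apps.get("app", []):
    record = {k: app.get(k) for k in ("id", "user", "queue", "state", "elapsedTime")}
    producer.send(TOPIC, value=record)

producer.flush()
```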
Data Analyst (Big Data/Data Science)
MAGNA SISTEMAS
Aug. 2019 - Oct. 2020 | São Paulo
Computer Vision: Trained a computer vision model for vehicle identification in Python, improving the efficiency and accuracy of identification processes.
Data Transformation and Ingestion: Developed data transformation processes in Python for efficient ingestion into SQL Server.
Power BI: Created interactive reports and dashboards to enable rapid analysis and data-driven decision-making.
Digitizing Incident Reports: Converted paper-based incident reports into digital formats using computer vision techniques in Python, streamlining administrative processes.
IBM Watson: Trained a machine learning model with IBM Watson to capture operational patterns, improving the organization's analytical and preventive capabilities.
Google Maps API: Used the Google Maps API to identify regions and incident areas, enriching analytical model training and improving analysis precision (a minimal geocoding sketch follows this entry).
Big Data (Python and Oracle SQL): Developed processes for data ingestion, transformation, and consumption in Big Data environments using Python and Oracle SQL, ensuring scalability and efficiency.
Technical Documentation: Documented Python scripts in detail to ensure clarity and ease of future maintenance.
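Illustrative only: a minimal sketch of the kind of Google Maps Geocoding call that could tag an incident with coordinates and a district-level region; the API key and address are placeholders, and the field names follow the public Geocoding API response format.

```python
# Sketch (placeholder key and address): geocode an incident address and extract its region.
import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"
API_KEY = "YOUR_API_KEY"  # placeholder


def geocode_incident(address: str) -> dict:
    resp = requests.get(GEOCODE_URL, params={"address": address, "key": API_KEY}, timeout=30)
    result = resp.json()["results"][0]

    # Keep the coordinates plus the district-level component as the incident's region.
    location = result["geometry"]["location"]
    region = next((c["long_name"] for c in result["address_components"]
                   if "administrative_area_level_2" in c["types"]), None)
    return {"lat": location["lat"], "lng": location["lng"], "region": region}


print(geocode_incident("Avenida Paulista, 1000, São Paulo"))
```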
Technical Support Analyst
Global Hitss
Dec. 2017 - Aug. 2019 | São Paulo and Region, Brazil

Requests

Looking for a Job
Data Engineer: Expert in Big Data Solutions