Endava, Database/DevOps Engineer
Oct. 2021, Romania
Technologies and tools used over the years
• PostgreSQL, Oracle, MS SQL, MongoDB
• SQL, PL/SQL, PL/pgSQL
• Java, Bash
• Oracle SQL Developer Data Modeler
• Ansible, Terraform, Docker, ActiveMQ, Nginx, Nagios, Grafana, Prometheus
• Microsoft Azure, AWS
• DBeaver, PL/SQL Developer, Oracle SQL Developer
• Git, GitLab
Personal projects:
ActiveMQ implementation in the OpenEMR stack
Duration: 07.2024 — present
OpenEMR is a medical billing application designed to streamline financial operations in healthcare
management. The project aimed to integrate ActiveMQ into the application's stack to improve
message transfer between the application and the PostgreSQL database.
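For illustration, a minimal sketch of how such a stack could be brought up with plain docker run commands; the image tags (postgres:16, apache/activemq-classic, openemr/openemr, nginx:stable), credentials, ports, and environment variable names are assumptions rather than the project's actual configuration:

#!/usr/bin/env bash
# Illustrative sketch only: image tags, credentials, ports, and variable names
# are assumptions, not the project's actual configuration.
set -euo pipefail

docker network create openemr-net || true

# PostgreSQL backing database
docker run -d --name pgdb --network openemr-net \
  -e POSTGRES_USER=openemr -e POSTGRES_PASSWORD=changeme -e POSTGRES_DB=openemr \
  -v pgdata:/var/lib/postgresql/data \
  postgres:16

# ActiveMQ message broker (61616 = OpenWire, 8161 = web console)
docker run -d --name activemq --network openemr-net \
  -p 61616:61616 -p 8161:8161 \
  apache/activemq-classic:latest

# Application container (image name and variables are placeholders)
docker run -d --name openemr-app --network openemr-net \
  -e DB_HOST=pgdb -e BROKER_URL=tcp://activemq:61616 \
  openemr/openemr:latest

# Nginx reverse proxy in front of the application
docker run -d --name nginx --network openemr-net \
  -p 80:80 -p 443:443 \
  -v "$PWD/nginx.conf:/etc/nginx/nginx.conf:ro" \
  nginx:stable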
Team
2 specialists
Position
DBA/DevOps Engineer
Responsibilities
• Implemented an application stack using Docker, consisting of the application, ActiveMQ, Nginx,
and a PostgreSQL database
• Leveraged Ansible to automate the configuration and deployment of Docker environments and the
application stack
• Employed Terraform to deploy and manage virtual machines, ensuring a scalable and repeatable
infrastructure
• Conducted in-depth analysis of database performance, identifying bottlenecks and areas for
improvement
• Implemented performance tuning techniques to optimize database queries, indexing, and resource
usage
• Managed code versioning using Git/GitLab, ensuring proper version control and collaboration
across the development team
• Created detailed architecture diagrams to visually represent the structure and components of
deployed applications
• Conducted thorough testing of deployed applications and their components to verify proper
operation and performance
Partitioning of a buyline table
Duration: 01.2024 — 03.2024
Some large queries triggered by the application accessed only a few days (or at most one month) of
recent data, referred to as "hot data." To improve overall application performance, particularly for
these high-consumption queries, we decided to implement range partitioning (by month) for a large
table.
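For illustration, a minimal sketch of the monthly range partitioning approach described above, using PostgreSQL declarative partitioning driven from a Bash/psql wrapper; the table and column names (buyline_part, created_at, amount) are hypothetical, and the production cut-over (data copy and table swap during the agreed downtime window) is reduced to comments:

#!/usr/bin/env bash
# Sketch of monthly range partitioning in PostgreSQL. Table and column names
# are hypothetical; the real cut-over during the downtime window is omitted.
set -euo pipefail

psql -v ON_ERROR_STOP=1 -d appdb <<'SQL'
-- New partitioned parent with the same columns as the original table
CREATE TABLE buyline_part (
    id          bigint       NOT NULL,
    created_at  timestamptz  NOT NULL,
    amount      numeric(12,2),
    PRIMARY KEY (id, created_at)  -- the partition key must be part of the PK
) PARTITION BY RANGE (created_at);

-- One partition per month; the most recent months hold the "hot data"
CREATE TABLE buyline_2024_01 PARTITION OF buyline_part
    FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
CREATE TABLE buyline_2024_02 PARTITION OF buyline_part
    FOR VALUES FROM ('2024-02-01') TO ('2024-03-01');

-- Queries restricted to recent dates now scan only the matching partitions
EXPLAIN SELECT sum(amount)
FROM   buyline_part
WHERE  created_at >= date_trunc('month', now());
SQL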
Team
15 specialists
Position
PostgreSQL DBA
Responsibilities
• Collaborated with cross-functional teams to diagnose and address performance issues impacting
database operations
• Led discussions to evaluate various optimization strategies, ultimately deciding that range
partitioning was the most effective solution
• Conducted extensive testing of the range partitioning solution in lower environments, replicating
data volumes similar to production
• Analyzed the performance of problematic queries and execution plans to validate the effectiveness
of the partitioning strategy
• Negotiated a downtime window that minimized disruption to business operations while allowing
sufficient time for the implementation
• Executed the range partitioning change in the production environment, carefully monitoring the
process to ensure a smooth implementation
• Continuously monitored database performance post-implementation, analyzing key metrics to
assess the impact of the change
• Coordinated with the application team to evaluate performance improvements from their
perspective, ensuring alignment between database and application optimizations
Technologies and tools
• PostgreSQL, Bash scripting
Migration of database information from PostgreSQL to MongoDB
Duration: 01.2024 — 03.2024
The purpose of the migration was to move audit tables to MongoDB to achieve better horizontal
scaling. The goal was to improve the performance of insert and select statements by using
sharding (spreading the audit collection across multiple servers).
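For illustration, a minimal sketch of how the target audit collection could be sharded from a Bash/mongosh wrapper; the database and collection names (auditdb.audit_log), the mongos host, and the hashed shard key are assumptions rather than the project's actual choices:

#!/usr/bin/env bash
# Sketch only: database/collection names, the mongos host, and the shard key
# are assumptions, not the project's actual configuration.
set -euo pipefail

mongosh "mongodb://mongos-router:27017" <<'JS'
// Enable sharding on the database that receives the migrated audit data
sh.enableSharding("auditdb");

// A hashed shard key spreads inserts evenly across the shards
sh.shardCollection("auditdb.audit_log", { record_id: "hashed" });

// Check how chunks are distributed across the shards
sh.status();
JS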
Team
3 specialists
Position
PostgreSQL DBA
Responsibilities
• Collaborated with the customer to thoroughly discuss and validate the benefits of the proposed
solution, ensuring alignment with their business goals
• Presented technical details and expected outcomes, addressing any questions or concerns the
customer had regarding the implementation
• Deployed the solution on AWS, leveraging the Database Migration Service (DMS) to facilitate
seamless data migration
• Configured the DMS instance and established two connectors, one for the source database and
one for the target MongoDB database
• Ensured that the connectors were correctly set up to maintain secure and efficient data transfer
between the two environments
• Developed and executed a replication task within DMS to migrate both existing data and ongoing
modifications into the designated MongoDB collection
• Coordinated with the development and operations teams to validate the successful migration of
data and to troubleshoot any issues that arose during the process
Technologies and tools
• PostgreSQL, MongoDB, AWS
ZFS Filesystem replication for all preprod databases
Duration: 07.2023 — 12.2023
Restoring databases from production to preprod daily using native PostgreSQL backups was taking
too long, especially for large database clusters. As an alternative, we implemented a solution
based on ZFS filesystem snapshots for incremental restores. Instead of restoring a full backup
each day, we relied solely on daily incremental restores, significantly reducing the time required
for the process.
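For illustration, a minimal sketch of the daily incremental send/receive cycle; the dataset names, hosts, and snapshot naming scheme are assumptions, and the real scripts also covered logging and cron scheduling as described in the responsibilities below:

#!/usr/bin/env bash
# Sketch of daily incremental ZFS replication from production to preprod.
# Dataset names, hosts, and snapshot naming are assumptions; the real scripts
# also included logging and were scheduled via cron across environments.
set -euo pipefail

SRC_DS="tank/pgdata"                     # production dataset holding PGDATA
DST_HOST="preprod-db01"
DST_DS="tank/pgdata"
TODAY="daily-$(date +%F)"
YESTERDAY="daily-$(date -d yesterday +%F)"

# Take today's snapshot on production
zfs snapshot "${SRC_DS}@${TODAY}"

# Send only the delta since yesterday's snapshot and apply it on preprod;
# -F rolls the target back to the last common snapshot before receiving
zfs send -i "${SRC_DS}@${YESTERDAY}" "${SRC_DS}@${TODAY}" \
  | ssh "${DST_HOST}" zfs receive -F "${DST_DS}"

# Example cron entry (03:00 daily), identical in every environment so that
# all preprod databases are refreshed to the same timestamp:
# 0 3 * * * /usr/local/bin/zfs_replicate.sh >> /var/log/zfs_replicate.log 2>&1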
Team
10 specialists
Position
PostgreSQL DBA
Responsibilities
• Tested ZFS replication (replicated databases using ZFS incremental snapshots) to determine if the
solution suited the use case
• Wrote Bash scripts for the implementation, integrating them with other components
• Planned the entire implementation process, provided deadlines to the customer, and allocated
team members from both the application and DevOps teams
• Attended meetings with other teams to ensure the solution met their needs
• Built a logging system for the replication script
• Created cron jobs across all environments so that all databases were replicated together,
ensuring synchronization to the same timestamp
• Created Confluence documentation to record the entire process
• Assisted the DevOps team in integrating the replication process into Ansible
• Gave a presentation to multiple teams about the entire replication process
Technologies and tools
• PostgreSQL, Ansible, AWS, Bash scripting, VS Code
PostgreSQL Migrations from version 11 to version 13
Data Academy presentation
Duration: 11.2022 — 02.2023
A presentation titled "Oracle Database Performance in Production Systems", consisting of two
parts: a theoretical section and a hands-on workshop.
Team
1 specialist
Position
Oracle DBA
Responsibilities
• Documented the presentation, including the scope and structure, to ensure the audience
understood the key concepts
• Created the PowerPoint presentation, covering topics such as performance and response
time, potential production issues related to performance, execution plans, indexes, DMLs, and
partitioning
• Delivered the theoretical part of the presentation
• Conducted the hands-on session: created a scenario with large tables, both partitioned and
non-partitioned, implemented indexes, wrote queries, generated execution plans, and provided
explanations
• Ran an SQL trace on a query and explained the TKPROF output, as sketched below
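For illustration, a minimal sketch of that trace demo; the connection string, demo schema, and trace file location are illustrative assumptions:

#!/usr/bin/env bash
# Sketch of the SQL trace / TKPROF demo; connection details, schema objects,
# and the trace directory are illustrative assumptions.
set -euo pipefail

sqlplus -s demo/demo@ORCLPDB1 <<'SQL'
-- Tag the trace file so it is easy to find, then enable SQL trace
ALTER SESSION SET tracefile_identifier = 'perf_demo';
ALTER SESSION SET sql_trace = TRUE;

-- The statement under investigation
SELECT /* perf_demo */ COUNT(*)
FROM   sales
WHERE  sale_date >= ADD_MONTHS(SYSDATE, -1);

ALTER SESSION SET sql_trace = FALSE;
EXIT
SQL

# Format the raw trace into a readable report (row source stats and waits)
TRACE_FILE=$(ls -t /u01/app/oracle/diag/rdbms/*/*/trace/*perf_demo*.trc | head -1)
tkprof "$TRACE_FILE" perf_demo_report.txt sys=no waits=yes sort=exeela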
Technologies and tools
• Oracle Concepts, Oracle SQL Trace, TKPROF