Work Background
Senior Data Cloud Engineer
AlphaSix Corporation
Jun. 2022 - Present, United States
• Currently providing MapR Hadoop cluster engineering and administration support.
Senior Big Data Cloud Architect
AT&T
Nov. 2019 - Jun. 2022, St. Louis, Missouri, United States
• Responsible for data governance design and compliance architecture implementation. Managed the sourcing, third-party/supplier selection, and execution of agreements in compliance with AT&T policies and operating practices. Worked with various PMs and business units to classify data, determine sources, and provide data architecture and data strategy planning.
• Coordinated various Azure and AWS cloud POCs, architectural reviews, and deployment automation. Solutions deployed in AWS POCs included Kubernetes, Grafana dashboards, Apache NiFi, Looker, and Snowflake.
• Accountable for Terraform, AWS Lambda, and CloudFormation automation development, testing, and deployment across a dozen AT&T environments. Standardized the bootstrap process for cloud account environments for optimal security, performance, and scalability.
• Led the AT&T Astra security remediation efforts, applying the Astra framework to AWS and Azure for all CDO cloud accounts per the "CIS Amazon Web Services Foundations" Benchmark. Matured a proactive cloud build-out process and set of standards under which cloud infrastructure automation is never built out in an insecure manner but is 100% Astra Cloud Security compliant from day 0 of a project. Developed AWS automation to enforce those Astra standards, actively banning or shutting down cloud infrastructure built out of compliance. Adept with Python, Perl, PowerShell, and Bash scripting.
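The enforcement automation described above can be illustrated with a minimal sketch. This is a hypothetical, simplified model (resources as plain dicts, a made-up REQUIRED policy), not the actual Astra tooling, which worked against live AWS APIs; it shows only the core pattern of sweeping resources against required settings and flagging violators for shutdown.

```python
# Hypothetical sketch of a CIS-style compliance sweep: resources failing any
# required setting (e.g. unencrypted storage, public exposure) are flagged
# for shutdown, mirroring a "compliant from day 0" enforcement policy.

# Illustrative policy, not the real Astra rule set.
REQUIRED = {"encrypted": True, "public": False}

def non_compliant(resources):
    """Return IDs of resources that violate any required setting."""
    bad = []
    for res in resources:
        for key, expected in REQUIRED.items():
            if res.get(key) != expected:
                bad.append(res["id"])
                break  # one violation is enough to flag the resource
    return bad

if __name__ == "__main__":
    fleet = [
        {"id": "i-001", "encrypted": True, "public": False},   # compliant
        {"id": "i-002", "encrypted": False, "public": False},  # unencrypted
        {"id": "i-003", "encrypted": True, "public": True},    # exposed
    ]
    for rid in non_compliant(fleet):
        print(f"shutting down {rid} (out of compliance)")
```

In a real deployment the sweep would run on a schedule (e.g. from a Lambda function) and the shutdown step would call the cloud provider's API instead of printing.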
Senior Big Data Engineer (Contractor)
CenturyLink
Nov. 2018 - Nov. 2019, St. Louis, Missouri, United States
• Executed all Hadoop administration duties for the Cloudera CDH Enterprise Data Lake as needed, including creating new Hive databases, creating new users, managing cluster core services, performing upgrades, and managing data job execution.
• Worked with the various CTS groups (Technical Director, Technical Architect, Developers, Database Administrator, System Administrator, and Project Management) to create and support big data clusters and applications, completing all requirements and meeting usability expectations.
• Estimated timelines for assigned projects and ensured work was completed within the estimates given. Provided leadership in the administration and implementation of short- and long-term technology plans and related policies and procedures.
• Took the lead role in establishing and implementing standards to facilitate a quality IT infrastructure for all clients. Instructed junior admins in daily core Hadoop duties and aided with hands-on examples as needed.
• Responsible for Hadoop clusters including components such as Cloudera (CDH & CM), Kafka, Impala, Hive, Spark, HDFS, HBase, Oozie, Sqoop, Flume, ZooKeeper, and YARN. Performed performance tuning of Hadoop clusters and jobs, plus capacity planning.
• Managed TLS/SSL certificates and implemented Kerberos. Supported and maintained HDFS, including archiving and management of data on an object store (S3A). Worked directly with vendors on escalations.
Senior Big Data Cloud Engineer
AT&T
Dec. 2017 - Nov. 2018, St. Louis, Missouri, United States
• Responsible for upgrades to 45 small on-premises Hortonworks HDP 2.6 Hadoop clusters (15-20 nodes each) in the Kingsmountain domain, which comprised the largest big data section of the AT&T Enterprise Data Lake.
• Accountable for Terraform, Lambda, and CloudFormation automation development, testing, and deployments across a dozen AT&T environments. Standardized the bootstrap process for AWS account environments for optimal security, performance, and scalability.
• Spearheaded the AT&T AWS security remediation efforts with Dome9/Astra AWS software across entire environments per the "CIS Amazon Web Services Foundations" Benchmark. Adept with Python 2.7, Python 3.6, and Node.js 6.10 programming.
• Responsible for data governance design and compliance architecture implementation. Managed the sourcing, third-party/supplier selection, and execution of agreements in compliance with AT&T policies and operating practices.
• Communicated and distributed sourcing, contracting, and compliance policies and procedures to applicable business unit leadership and other personnel regarding data governance. Assisted business units with contract negotiations when high-risk areas had not been properly covered in the contract or when risk mitigation was required.
Senior Technical Architect (Senior Consultant)
Charter Communications
Jul. 2017 - Dec. 2017, Maryland Heights, Missouri, United States
• Led the Hadoop administration team in planning, engineering, and building out new clusters, and in administering four existing 50-200 node HDP clusters. Designed and documented the plan for auditing, security, encryption at rest, and user and group management for all Hadoop clusters. Built and designed security policies with Apache Ranger and YARN queues, configured Kerberos, and orchestrated and executed complex maintenance windows in multiple hybrid environments.
• Architected new data process flows and data ingestion for the Viewership Data Lake; standardized the data ETL tools, processes, and methods for data validation, data reconciliation, data visualization, and aggregation.
• Technical team lead and SME for architecting the migration of a 135-node Hadoop cluster into the AWS cloud via HDP 2.6 Cloudbreak on EC2 instances, covering configuration of VPCs, EBS, S3, security groups, IAM management, Route 53, CloudWatch, Kinesis for streaming data, and RDS and DynamoDB integration. Performed budget forecasting and future planning via AWS Trusted Advisor and AWS Billing Reports.
Senior Big Data Architect Consultant
Caterpillar Financial Services Corporation
Feb. 2017 - Jul. 2017, Peoria, Illinois Area
• Engaged as technical team lead for the data migration strategy to the cloud (both AWS and Azure). Performed capacity planning, staging, layout, and hands-on build for infrastructure migration and integration. Planned, built, and deployed a new five-layer end-to-end monitoring system using custom-written applications writing to a MySQL database, with live streaming data to custom Tableau dashboards.
• Provided expert technical guidance on existing and new Hadoop architectural design issues, ensuring standards were followed and that the processes in place could resolve the issues at hand. Coached and mentored junior associates on complex technical issues as they arose; reviewed existing Hadoop developers' designs and architecture for Oozie workflows, ETL jobs, streaming data ingestion, and other design points involving data governance, data modeling, data security, data processing, data reconciliation, and data delivery.
• Created a small library of custom Python and Bash scripts for more effective cluster command-line management and custom monitoring of hardware and cluster issues, and developed vendor-neutral tools to ease migration to the cloud. Implemented ETL standards using proven data processing patterns with open-source tools such as Talend and Pentaho for more efficient processing and repeatable deployments across multiple tenants, customers, and environments.
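A custom cluster-monitoring script of the kind described above can be sketched as follows. This is an illustrative example only, not the actual library: the abbreviated report text and the 85% threshold are assumptions, and `check_report` is a hypothetical helper that parses `hdfs dfsadmin -report`-style output for two common alert conditions.

```python
import re

# Abbreviated, illustrative excerpt of `hdfs dfsadmin -report`-style output.
SAMPLE = """\
Configured Capacity: 1000
DFS Used%: 87.50%
Live datanodes (48):
Dead datanodes (2):
"""

def check_report(text, max_used_pct=85.0):
    """Flag high DFS usage and dead datanodes from dfsadmin-style output."""
    alerts = []
    used = re.search(r"DFS Used%:\s*([\d.]+)%", text)
    if used and float(used.group(1)) > max_used_pct:
        alerts.append(f"DFS usage {used.group(1)}% exceeds {max_used_pct}%")
    dead = re.search(r"Dead datanodes \((\d+)\)", text)
    if dead and int(dead.group(1)) > 0:
        alerts.append(f"{dead.group(1)} dead datanodes")
    return alerts

if __name__ == "__main__":
    for alert in check_report(SAMPLE):
        print("ALERT:", alert)
```

In practice the report text would come from running the `hdfs dfsadmin -report` command (e.g. via `subprocess`) rather than a hard-coded sample.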
Senior Consultant
Mastercard
Nov. 2016 - Feb. 2017, Greater St. Louis Area
• Functioned as subject matter expert and consultant to lead and develop the overall big data strategy solution and technical roadmap deliverable, unifying various groups within a large financial customer by combining real-time, near-real-time, streaming, and batch processing into a single comprehensive integrated platform performing fraud detection, BIN profiling, customer authentication, implication scoring, macro fraud pattern matching, and other data-product-driven applications.
• Provided the new HBase cluster layout, cluster division and partitioning, and hardware and software recommendations; constructed a cluster framework calculation software tool; and developed strategies for Cloudera Enterprise CDH 5.5 resource pools, static service pools, and HDFS traditional and Sentry (Cloudera) controlled multi-tenancy and group separation.
• Aligned the technical requirements with the business goals and objectives of all customer groups and stakeholders involved. Delivered effective professional presentations and the data strategy message to the customer's fraud group senior management and overall company senior management. Successfully fielded all questions concerning the technical solution, business strategy, cost economics, and the overall planned schedule and roadmap.
• Direct experience implementing all aspects of installation and configuration for Hadoop clusters, building cloud architecture, troubleshooting as needed, capacity planning, and Hadoop and system administration. Implementation experience in ETL development, writing code in Perl, Python, Bash, and some Scala as needed.
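The cluster framework calculation tool mentioned above can be sketched with the standard HDFS sizing arithmetic: raw storage equals logical data multiplied by the replication factor and a temporary/overhead factor, divided by usable storage per node. This is a minimal illustration under assumed defaults (3x replication, 25% overhead, 48 TB usable per node), not the actual tool.

```python
import math

def nodes_needed(data_tb, replication=3, overhead=1.25, node_tb=48):
    """Estimate the data-node count for an HDFS cluster.

    raw need = logical data x replication x temp/intermediate overhead;
    the result is divided by usable TB per node and rounded up.
    Defaults are illustrative assumptions, not fixed recommendations.
    """
    raw_tb = data_tb * replication * overhead
    return math.ceil(raw_tb / node_tb)

if __name__ == "__main__":
    # 500 TB logical data -> 500 * 3 * 1.25 = 1875 TB raw -> 40 nodes
    print(nodes_needed(500))
```

A real sizing tool layers on growth projections, YARN memory/CPU sizing, and per-tenant pool allocations, but the storage arithmetic is the core of it.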
Senior Big Data Architect
Unisys
Nov. 2014 - Oct. 2016, Greater St. Louis Area
• Served as expert, engineering team lead, and big data SME for the BDAaaS group within the AMCOE data center, providing team guidance and direction on administration, development, business use cases, hardware recommendation and selection, best practices, industry standards, performance tuning and monitoring, and security standards and policies for Unisys big data environments.
• Provided high-level and detailed design plans and documentation for Hortonworks HDP 2.2 Hadoop environments, Cloudera CDH 5.1 environments, and other Hadoop distribution clusters, along with network diagrams and layouts and logical and physical architectural documentation. Assessed and reviewed new big data business intelligence software tools and vendors. Researched and analyzed big data industry trends and patterns, security, new vendors, and new open-source software solutions; wrote internal white papers; and provided peer review of overall design and big data environment stability.
• Led the team in standardizing installation and configuration of BDAaaS environment build-outs, providing seamless automation via custom Puppet modules, Chef recipes, and/or Python scripts as required. Constructed and drove project schedules, timelines, and project lifecycles to acceptable levels of performance and closure. Trained junior team members in Hadoop core competencies and fundamentals. Assisted Hadoop developers in the ETL process; responsible for data ingestion orchestration, data import/export, data governance, and data protection and security policies and implementations. Performed hands-on Hadoop administration and Java application troubleshooting, regular environment updating and patching, maintenance scheduling, and migration planning.
Senior UNIX Application Engineer
CenturyLink
Apr. 2009 - Oct. 2014, Greater St. Louis Area
• Performed daily Opsware server automation duties, constructing software policies for Opsware and Red Hat Satellite server automation, and managed the VMware lab hosts, clusters, and attached storage using the VMware Virtual Infrastructure Client on VMserver 3.5, as well as managing and configuring VMware ESX 3.5 Server on HP ProLiant DL380 G5 platforms.
• Consulted with vendors on SaaS Web 2.0 offerings that Savvis would partner with to productize and utilize in its cloud computing model. Analyzed application design strengths, weaknesses, utilization, data collection methods, scalability, security model, redundancy and high-availability features, authentication and identity methods, and programming languages implemented. Evaluated software by installing and configuring demo or evaluation licenses on Savvis standard OS builds in the lab. Provided concise and timely reports to upper management so critical decisions could be made on the basis of sound technical and logical study of the application.
• Tested with JMeter, Apache Benchmark (ab), and other tools to benchmark applications against Savvis's stringent application standards.
• Architect for multiple SaaS offerings, including Omniture Web Analytics (SiteCatalyst), Precise Application Performance Monitoring (APM), Savvis Intelligent Agent, Tomcat, JBoss, and productizing IBM WebSphere.
• Team lead for the Tomcat, JBoss, and Apache productization line. Conducted extensive research, set standards, researched and developed best practices and a Tomcat suite of management tools for installs, deployment methods, and large-scale automation on Savvis standard platforms. Built RPMs for the Savvis Tomcat Opsware policies, creating software toolkits and advanced customized tools via Perl and Bash shell scripts.
Senior UNIX/Linux System Engineer
CenturyLink
Apr. 2006 - Apr. 2009, Greater St. Louis Area
• Systems engineer on a $100 million, 400-website migration project for a key corporate customer. Charged with initial setup, system and network configuration, kernel tuning, RPM management, DNS server/client and sendmail setup, basic system configuration, user account management, setup of SSH, SFTP, and SNMP trap configs, and quality assurance for some 60 RHEL AS 3.0 & 4.0 servers and 22 Windows 2003 servers. Initial work on all servers was completed within 27 days; migration of the 400 websites was completed in 6 months.
• Performed hardware configuration and troubleshooting for Linux and Windows 2003 servers on Egenera BladeFrame and standalone servers. Tasks included installation of applications, configuration of services, user and group administration, system security, disk management, NetBackup configuration, device management, and system resource allocation and monitoring. Some administration of IIS and troubleshooting of Windows-based websites.
ServerOps Shift Lead (Senior Consultant/Staff Manager, Ajilon)
AT&T
Apr. 2004 - Apr. 2006, Bridgeton, Missouri
• Served on the tier 2 support team, tasked with remote monitoring and initial analysis of critical alarms for software and hardware issues on 1,100 Solaris and Linux servers across the country through a Remedy ticketing system and HP OVO, implementing remote fault management. Troubleshot critical system, network, and application failures; identified and prioritized SLA outages; restored services; and escalated issues to the AT&T Technical Development team with detailed conclusions.
• Performed remote and local hardware administration, troubleshooting failed hardware components with tier 4 and/or vendors. Worked with all manner of Sun SPARC hardware, performing a wide array of tasks including replacing memory modules, CPUs, system boards, backplanes, quad-card NICs, and SCSI disks.
• Performed emergency demand maintenance as required and scheduled maintenance as needed. Wrote scripts to render system metrics in a more human-readable format.

Personal Pitch
Efficient Data Architecture Solutions