JavaScript, NodeJS, Ruby, Git, Docker, Kubernetes, AWS, PostgreSQL
JavaScript, NodeJS, Git, Docker, GCP, React, MongoDB, BigTable
Worked on fraud detection products built in NodeJS and hosted on GCP. This included recommending and implementing changes to how Cloud Bigtable was used, as well as changes to the React front end for reviewing flagged orders.
JavaScript, HTML, CSS, JSP, Git, Docker, AWS, React
Worked on performance optimizations in both CSS and JavaScript. Developed website enhancements to improve ad revenue and subscriber retention. Built improvements and tooling around Docker to streamline development.
JavaScript, Perl, HTML, CSS, SVN, Git, Storm, Kafka
Over the many years I was with the company, I worked on a variety of projects. I created brand-new mobile libraries for our main locator product, as well as enhancements to our existing front-end locator code. Those enhancements included multiple-waypoint driving directions and a feature called “Along the Way,” which lets users see locations near their chosen route. I also set the architecture for newer responsive libraries built using Backbone.
In addition to front-end work, I made enhancements to the existing back-end server code, written in Perl. One of the first back-end features I worked on extended static map generation to include driving-directions maps. I also enhanced our local landing pages system, which dynamically generates a web page for each location a client has. Those enhancements include the ability to inject a small image gallery and ads onto the page, controlled by both the corporate client and the location managers.
Most recently I have been working on the architecture and development of a new Node.js server built on hapi.js. This new server is designed for flexibility, scalability, and speed. Alongside architecting a new Node.js code base for Brandify, I have also led the adoption of Apache Storm and Kafka for processing real-time data and events. I designed and coded the first topologies, and I have trained other developers in the company on using Kafka and on creating and deploying Storm topologies.
Salesforce
Model Metrics was a provider of cloud computing solutions built on salesforce.com, Amazon, and Google technologies. Model Metrics was later acquired by salesforce.com.
During my time at Model Metrics I primarily used the ETL (Extract, Transform, Load) tool Talend to perform complex data migrations and integrations for clients. I also enhanced internal tools, written in PHP, that analyzed the quality of clients’ data in salesforce.com. One of those enhancements improved report generation speed by generating the pages of the report in parallel.
Course Work: Operating Systems | Technical Communications | Data Structures | Software Engineering | Rich Internet Applications | Internet Technologies & Web Design | Service-Oriented Architectures | Cryptography and Network Security | Information Security | Database Organization
Programming: JavaScript, Perl, Python, Java, Ruby, Go, Rust
Markup: HTML, CSS, JSON, XML, YAML, Markdown
Other: Apache Web Server, NGINX, Docker, Storm, Kafka, AWS, Google Cloud, ReactJS