Laguna Games 2021 - Present
Building really cool things in Elixir!
Phoenix Labs 2018 - 2019
Joined the rapidly growing studio to assist the platform team in scaling services to meet the 50x load
we would encounter during the launch of Dauntless's open beta. Then joined the live operations
team after launch to identify their needs and build several tools to assist in the day-to-day maintenance
of the game. Finally, transitioned to the analytics team to spearhead efforts to substantially
improve how the team and studio at large would access and relate to our data. As the first data engineer,
I turned unreliable scripts into a scalable and reliable data pipeline, automated the majority of manual
work, and standardized processes and tools. The improvements translated directly into team productivity
and effectiveness, allowing us to effect change at an organizational level.
Migrated an unmaintained social service from AWS to GCP and took ownership of it.
- Wrote supporting infrastructure in Erlang and Python to integrate with our platform.
- Wrangled hairy build processes around Jingle and Strophe to improve the client side.
- Performed extensive load testing and tuning to ensure it could cope with load at launch.
- Communicated intricacies of XMPP to colleagues to assist client-side work.
- Set up monitoring and alerts to ensure the reliability of social functionality.
Built several internal tools from scratch to improve colleagues’ day-to-day effectiveness.
Account cloning tool.
- Used daily by testers. Saved them countless hours.
- Used to reproduce several bugs only seen in production.
Broadcast messaging tool used during incidents and deploys.
Dashboard and inspection tool to understand the state and health of the player base and servers.
Diagnostic tooling to track down client bugs in production.
- Deeply integrated into Unreal Engine.
- Set up so diagnostics can be toggled remotely.
All the tools were integrated with Active Directory to make them as easy as possible to use and administer.
Built a horizontally scalable service to track users and key metrics in real-time.
- Automatically scales to meet demand by utilizing Kubernetes.
Relied upon to automatically detect live issues.
- Detected service outages and degradations before players noticed.
- Architected and implemented both the client (C++) and server-side (Python).
Owned the telemetry pipeline end-to-end.
Rebuilt it to handle millions of events per second in near real time.
- Required a patch for Redis.
Several emitters, including our website, patcher, game, and services.
- Greenfield, requiring me to architect and implement both the client (C++) and server-side (Python).
- Tracked down and eliminated several bugs that caused problems for analysts.
Built a reliable data pipeline from scratch.
- Set up to handle ~100 terabytes of data.
- Collated data from ~15 sources across internal and external systems.
- Made it simple to build complex multistage pipelines and taught analysts how to use it.
- Supported "schemaless" data by automatically detecting schema from streams.
- Reduced latency from weeks (fully manual) to days, and ultimately to one hour.
- Set and met an SLA to ensure data was always available for analysis.
Assisted with data analysis.
- Analyzed, interpreted, and reported on large datasets to deliver insight that weighed into business decisions.
- Built and maintained dashboards to report metrics studio-wide.
- Applied SQL expertise to restructure data and queries, reducing complexity and improving performance.
Took on data stewardship.
- Introduced metadata that became vital to all analysis.
- Normalized a lot of poorly structured or broken data.
- Assisted others in the hookup and verification of telemetry.
Acted as de facto database administrator.
- The go-to person for all things database-related.
- Fixed several deadlocks causing player-facing issues.
- Wrote several scripts to perform migrations and fixups.
- Identified data corruption and loss, then led recovery efforts.
Walter 2015 - 2017
Worked with the founding team to identify and determine the feasibility of an investor search tool by building
and selling a prototype, then grew the engineering team and led the engineering effort to take the
product to the next level. Focused on the architecture and development of a backend that continuously
collected and analyzed data to match founders to investors and vice versa.
Built a prototype and iterated until product-market fit.
- Built an end-to-end prototype in less than a month.
- Built and optimized sales and marketing funnel.
- Defined strategy and tactics with the founding team.
- Ultimately assisted ~50 founders in raising $58m in early-stage rounds.
Built and directed a team of 3.
- Built mutual trust with my team.
- Fostered conversation around the product and engineering.
- Established continuous deployment, automated testing, and code reviews.
Developed and ran a hiring process from end to end.
- Utilized unconventional tactics to reach many high-quality candidates with ~$100 spend.
- Developed an interviewing format that challenged candidates.
- Took 3 weeks from requisition to hire.
Learned a lot!
Architected and developed a system for building a knowledge graph.
- Collated and reconciled several large data sources to produce a database of people, organizations, and their behaviour.
- Built several algorithms to derive relationships and their strength.
- Prototyped a novel database inspired by Datomic to simplify management of the dataset.
- Led frontend and backend developers to expand the graph and its use.
Built robust crawlers for collecting information about private companies.
- Crawled 10s of millions of pages across several sources.
- Designed crawlers to robustly handle website redesigns.
- Reverse engineered undocumented APIs to extract even more data.
- Identified and fixed bugs in Scrapy (a popular toolkit for crawling).
Built just-in-time investment and social graph analysis service.
- Codified and automated manual processes to scale as a business.
- Automatically generated customized insights in milliseconds.
- Leveraged deep knowledge of Erlang and PostgreSQL to optimize complex algorithms and queries.
- Experimented with the application of novel research to reduce searches to constant time.
Introduced machine learning to enrich data and enhance predictions.
- Used to parse natural language into machine-understandable features.
- Preliminary work to build, train, and iterate on models to assist our predictive algorithms.
CrossPacific Capital Partners 2014 - 2015
Worked with a cross-functional team to identify, research, and prototype potential products and
businesses relating to cryptocurrencies with a focus on improving the way content creators are
compensated for their work.
Bex 2013 - 2015
Joined the team to extend and maintain the backend logic of a white-label cryptocurrency exchange built
in Elixir (which had yet to reach 1.0) and occasionally assist with frontend tasks. Also assisted in the
development and management of relationships in business and academic environments.