Built the Redis + Nginx caching layer that kept The Daily Beast online through the 2016 election — 120k requests per second — plus a CMS and React component library the editorial team still used after I left.
During the 2016 US election, traffic to The Daily Beast spiked to 120k requests per second. The site couldn't hold. Beyond the traffic problem, the newsroom was drafting articles in Google Docs and handing them off — collaboration was slow, versioning was informal, and publishing was manual.
The fix had to hold up under election-night traffic without a rewrite, and the CMS had to ship fast enough to matter before the next news cycle.
Implemented an article-caching layer using Redis and Nginx — articles served from cache, invalidated on publish, holding 120k rps without origin pressure. Built a CMS so writers could draft, review, and publish inside one tool instead of emailing Google Docs around.
Developed a React UI component library used across the product. Integrated Optimizely and PPC partners (Outbrain, Taboola). Shipped an infinite-scrolling pattern that measurably increased pages-per-visit, time-on-page, and PPC ad revenue.
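The paging logic behind an infinite-scroll feed like the one above can be reduced to a small pure function. This is a sketch under assumptions, not the shipped component: names like `FeedState` and `nextPage` are illustrative, and in the real React code this state was driven by a scroll-position trigger.

```typescript
// Hypothetical paging reducer for an infinite-scroll feed.
type FeedState = {
  items: string[]; // article IDs already rendered
  page: number;    // next page index to fetch
  done: boolean;   // origin has no more pages
};

const PAGE_SIZE = 10;

// Append a fetched page; a short (or empty) page marks the feed exhausted.
function nextPage(state: FeedState, fetched: string[]): FeedState {
  if (state.done) return state;
  return {
    items: [...state.items, ...fetched],
    page: state.page + 1,
    done: fetched.length < PAGE_SIZE,
  };
}
```

Keeping the pagination state pure like this makes the scroll trigger trivial to test in isolation, independent of the rendering layer.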
The caching strategy kept the site serving readers through the highest-traffic event of the year. The CMS replaced Google Docs as the editorial workflow. The component library became the shared UI vocabulary across product surfaces.