High-Throughput FIDO2 / WebAuthn Trust Server
Context: A high-security fintech environment requiring phishing-resistant authentication for millions of users.
Constraints: Must achieve FIDO Alliance functional certification, support legacy fallback, and handle high-throughput assertion requests.
Approach: Built a custom WebAuthn server from scratch to retain full control over the cryptographic implementation and FIDO Metadata Service (MDS) integration, achieving a 100% pass on the FIDO functional conformance tests. Implemented a robust state machine to parse complex attestation statements and enforce strict authenticator security policies.
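The fixed-length prefix of WebAuthn authenticator data has a layout defined by the W3C spec (32-byte rpIdHash, one flags byte, 32-bit big-endian signature counter), and the policy checks above hinge on reading it correctly. A minimal parsing sketch; the class and record names are illustrative, not taken from the server described here:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

// Parses the fixed prefix of WebAuthn authenticator data (W3C WebAuthn
// spec, section 6.1). Offsets and flag bits come from the spec; the
// names here are illustrative.
public class AuthDataParser {

    public record AuthDataPrefix(byte[] rpIdHash, boolean userPresent,
                                 boolean userVerified, boolean attestedDataIncluded,
                                 long signCount) {}

    public static AuthDataPrefix parsePrefix(byte[] authData) {
        if (authData.length < 37) {              // 32 + 1 + 4 bytes minimum
            throw new IllegalArgumentException("authenticator data too short");
        }
        byte[] rpIdHash = Arrays.copyOfRange(authData, 0, 32);
        byte flags = authData[32];
        boolean up = (flags & 0x01) != 0;        // bit 0: User Present
        boolean uv = (flags & 0x04) != 0;        // bit 2: User Verified
        boolean at = (flags & 0x40) != 0;        // bit 6: Attested credential data follows
        // signCount is a 32-bit unsigned big-endian counter
        long signCount = ByteBuffer.wrap(authData, 33, 4).getInt() & 0xFFFFFFFFL;
        return new AuthDataPrefix(rpIdHash, up, uv, at, signCount);
    }
}
```

A real server would continue past byte 37 into the attested credential data and CBOR-encoded extensions when the corresponding flags are set.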
Key Lessons: Deep integration with hardware authenticators requires more than just spec compliance; it requires extensive real-world testing against hundreds of device types and navigating complex transaction signing flows.
On-Premise RAG Service Architecture
Context: The need to leverage large language models (LLMs) for internal knowledge retrieval, specifically helping developers understand internal frameworks, without exposing the proprietary codebase or architectural data to external APIs.
Approach: Designed a production-grade Retrieval-Augmented Generation (RAG) service utilizing local inference engines and enterprise-grade multi-GPU servers. Engineered the infrastructure to support and optimize high-parameter models (32B and 70B) under strict memory and latency constraints.
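Capacity planning for 32B- and 70B-parameter models under strict memory constraints typically starts from a back-of-envelope estimate of weight memory plus KV-cache memory. A sketch of those standard formulas; any concrete parameter counts, context lengths, or quantization widths plugged in are illustrative, not this project's actual configuration:

```java
// Back-of-envelope VRAM sizing for serving a dense LLM: weight memory
// plus KV-cache memory. These are the standard rough estimates, not
// exact figures for any particular inference engine.
public class VramEstimate {

    // Weight memory in GiB: parameter count times bytes per parameter
    // (e.g. 2.0 for fp16/bf16, 0.5 for 4-bit quantization).
    public static double weightsGiB(double paramsBillions, double bytesPerParam) {
        return paramsBillions * 1e9 * bytesPerParam / (1024.0 * 1024 * 1024);
    }

    // KV cache per sequence in GiB:
    // 2 (K and V) * layers * kvHeads * headDim * contextLen * bytesPerElem
    public static double kvCacheGiB(int layers, int kvHeads, int headDim,
                                    int contextLen, double bytesPerElem) {
        double bytes = 2.0 * layers * kvHeads * headDim
                * (double) contextLen * bytesPerElem;
        return bytes / (1024.0 * 1024 * 1024);
    }
}
```

The weight term alone shows why a 70B model in fp16 (roughly 130 GiB) forces either multi-GPU tensor parallelism or aggressive quantization.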
Impact: Delivered a secure, internal AI assistant capable of contextualizing internal documentation and drastically reducing the onboarding time for new engineering hires.
Enterprise Core Framework
Context: A growing development organization requiring a standardized, highly performant baseline for all new microservices to reduce technical debt.
Approach: Architected a standard core framework leveraging modern Java, Spring Boot, and GraalVM Native Image for optimized startup times and low memory footprints. Automated the deployment pipeline, secrets management, and server hardening using Ansible and Kubernetes (MicroK8s).
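GraalVM Native Image closes the world at build time, so any reflective access, common in legacy data-migration DTOs and MongoDB document mapping, must be declared ahead of time. A minimal reflect-config.json sketch in GraalVM's standard format; the class name is hypothetical:

```json
[
  {
    "name": "com.example.migration.LegacyCustomerDocument",
    "allDeclaredFields": true,
    "allDeclaredConstructors": true,
    "allDeclaredMethods": true
  }
]
```

Spring Boot 3's AOT engine generates most of this metadata automatically; hand-written entries like the above are usually only needed for classes the AOT analysis cannot see.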
Impact: Significantly reduced boilerplate code, standardized security practices across teams, and created a resilient infrastructure baseline capable of handling complex legacy data migrations (e.g., MongoDB upgrades).
Hybrid Post-Quantum Cryptography Deployment
Context: Research and pilot implementation of quantum-resistant algorithms for a financial institution's internal communication.
Constraints: Maintain compatibility with existing TLS stacks while introducing ML-KEM and ML-DSA, including configuring PQC in Nginx.
Approach: Developed a hybrid approach in which transactions and connections are secured by both classical algorithms (ECC/RSA) and post-quantum algorithms (ML-DSA/ML-KEM), so the channel remains secure even if one of the two algorithm families is later broken.
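The core of the hybrid construction is that the session key is derived from both shared secrets together, so breaking only one algorithm yields nothing. A minimal JDK-only sketch: the X25519 half and the combining step use real JCA calls, but the ML-KEM shared secret is a stand-in byte array (stock JDKs before 24 ship no ML-KEM provider), and a production design would use HKDF with proper context binding rather than a bare hash:

```java
import javax.crypto.KeyAgreement;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.MessageDigest;

// Hybrid key derivation sketch: session key = KDF(classical || post-quantum).
// Only the classical X25519 exchange is performed for real here.
public class HybridKdf {

    // Combine both shared secrets into a 32-byte session key.
    // Simplification: a single SHA-256 stands in for a full HKDF.
    public static byte[] deriveSessionKey(byte[] classicalSecret, byte[] pqSecret)
            throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        md.update(classicalSecret);
        md.update(pqSecret);
        return md.digest();
    }

    // Classical half: X25519 key agreement (JCA algorithm name since Java 11).
    public static byte[] x25519Secret(KeyPair mine, KeyPair theirs) throws Exception {
        KeyAgreement ka = KeyAgreement.getInstance("X25519");
        ka.init(mine.getPrivate());
        ka.doPhase(theirs.getPublic(), true);
        return ka.generateSecret();
    }

    public static KeyPair newX25519KeyPair() throws Exception {
        return KeyPairGenerator.getInstance("X25519").generateKeyPair();
    }
}
```

This mirrors the structure of standardized hybrid TLS groups, where the classical and ML-KEM secrets are concatenated before key derivation.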
Key Lessons: The performance overhead of PQC is manageable, but the increased key and signature sizes require careful adjustment of network buffer sizes and backend architectures.
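One way the Nginx side of this can look, assuming an nginx built against OpenSSL 3.5+ (which provides ML-KEM key-exchange groups); the group list and buffer size below are illustrative, not the deployment's actual values:

```nginx
server {
    listen 443 ssl;
    ssl_protocols TLSv1.3;

    # Prefer the hybrid X25519 + ML-KEM-768 group, fall back to classical
    ssl_ecdh_curve X25519MLKEM768:X25519:prime256v1;

    # Larger handshake messages from PQC key shares and signatures may
    # warrant revisiting TLS record buffer sizing (nginx default is 16k)
    ssl_buffer_size 16k;
}
```

Clients that do not offer the hybrid group simply negotiate one of the classical fallbacks, which is what keeps the rollout compatible with existing TLS stacks.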