Data Engineer
Swish Analytics
Company Overview
Swish Analytics is a sports analytics, betting, and fantasy startup building the next generation of predictive sports analytics data products. We believe that oddsmaking is a challenge rooted in engineering, mathematics, and sports betting expertise, not intuition. We're looking for team-oriented individuals with an authentic passion for accurate, predictive real-time data who can execute in a fast-paced, creative, and continually evolving environment without sacrificing technical excellence. Our challenges are unique, so we hope you are comfortable in uncharted territory and passionate about building systems that support products across a variety of industries and consumer/enterprise clients.
Role Overview
You'll build trading infrastructure for sports betting platforms, leveraging our proprietary fair value models to gain a trading edge. This is a high-impact role in which you'll architect and deploy low-latency systems that make real-time trading decisions.
Core Responsibilities
Trading Infrastructure Development
- Build and optimize market-making engines that consume real-time order book data and execute trades across multiple platforms
- Develop low-latency data ingestion pipelines for market data feeds (WebSocket, FIX protocol, REST APIs)
- Create real-time edge analysis systems that compare our fair values against live market prices to identify profitable opportunities (see the illustrative sketch after this list)
- Implement order management systems with robust error handling, position tracking, and risk controls
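To give a concrete, purely illustrative sense of the edge-analysis work above, here is a minimal Python sketch of a WebSocket consumer that compares incoming market prices against precomputed fair values. The endpoint, market identifiers, message schema, fair values, and threshold are placeholder assumptions, not our actual models or feeds.

```python
# Illustrative sketch only: consume a market-data WebSocket and flag quotes that
# trade at a meaningful premium to a model fair value. All names are placeholders.
import asyncio
import json

import websockets  # third-party: pip install websockets

FAIR_VALUES = {"GAME-123:HOME_ML": 1.91}  # assumed model output: fair decimal odds
EDGE_THRESHOLD = 0.03                     # minimum relative edge worth flagging


async def watch_market(url: str) -> None:
    async with websockets.connect(url) as ws:
        async for raw in ws:
            quote = json.loads(raw)              # assumed shape: {"market": ..., "price": ...}
            fair = FAIR_VALUES.get(quote["market"])
            if fair is None:
                continue
            edge = quote["price"] / fair - 1     # relative edge vs. model fair value
            if edge > EDGE_THRESHOLD:
                print(f"{edge:.1%} edge on {quote['market']} at {quote['price']}")


if __name__ == "__main__":
    asyncio.run(watch_market("wss://example-feed.invalid/quotes"))  # placeholder endpoint
```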
Production Systems & Operations
- Deploy and manage containerized applications on Kubernetes
- Build automated testing, deployment, and rollback procedures for trading systems
- Design and implement post-trade analysis tools to evaluate strategy performance (see the sketch after this list)
- Handle production incidents and optimize system performance under load
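As a rough, hedged illustration of post-trade analysis, the following sketch aggregates a hypothetical fills export into per-strategy metrics with pandas. The file name and column layout are assumptions made for the example, not our actual data model.

```python
# Illustrative post-trade analysis sketch; input file and columns are assumed.
import pandas as pd

# Assumed columns: strategy, side ("buy"/"sell"), qty, price, fair_value
fills = pd.read_csv("fills.csv")

direction = fills["side"].map({"buy": 1, "sell": -1})
# Edge captured per fill: positive when buying below fair value or selling above it.
fills["edge_captured"] = direction * (fills["fair_value"] - fills["price"]) * fills["qty"]

summary = fills.groupby("strategy").agg(
    fill_count=("qty", "size"),
    total_qty=("qty", "sum"),
    edge_captured=("edge_captured", "sum"),
)
print(summary.sort_values("edge_captured", ascending=False))
```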
Required Qualifications
Technical Fundamentals
- 3+ years of production Python experience with strong async programming skills
- Deep understanding of Python's async/await patterns, event loops, and concurrent execution (see the sketch after this list)
- Experience building and maintaining production services in Linux environments
- Strong system design skills for distributed, real-time data processing systems
- Proficiency with SQL databases and data modeling
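A minimal example of the async proficiency we mean: running several I/O-bound tasks concurrently on one event loop. The feed names and sleep calls are stand-ins for real network requests.

```python
# Minimal asyncio illustration: concurrent I/O-bound tasks with asyncio.gather.
import asyncio
import random


async def fetch_feed(name: str) -> str:
    await asyncio.sleep(random.uniform(0.1, 0.5))  # stand-in for a network request
    return f"{name}: ok"


async def main() -> None:
    # All three "fetches" run concurrently on the same event loop.
    results = await asyncio.gather(*(fetch_feed(f"feed-{i}") for i in range(3)))
    print(results)


if __name__ == "__main__":
    asyncio.run(main())
```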
Infrastructure & Tools
- Experience with containerization (Docker) and orchestration (Kubernetes)
- Familiarity with message streaming platforms (Kafka preferred; see the sketch after this list)
- Understanding of monitoring, logging, and observability practices
- Git workflows and CI/CD pipelines
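For context on the streaming side, here is a hedged sketch of a Kafka consumer in Python using the kafka-python package; the topic name, broker address, consumer group, and message schema are assumptions for illustration only.

```python
# Illustrative Kafka consumer sketch; topic, broker, and schema are assumed.
import json

from kafka import KafkaConsumer  # third-party: pip install kafka-python

consumer = KafkaConsumer(
    "market-quotes",                       # assumed topic name
    bootstrap_servers="localhost:9092",    # assumed broker address
    group_id="edge-analysis",              # assumed consumer group
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="latest",
)

for record in consumer:
    quote = record.value
    # Downstream: compare against fair values, update positions, emit alerts, etc.
    print(record.topic, record.offset, quote)
```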
Core Competencies
- Ability to write clean, maintainable, well-tested code
- Strong debugging skills and systematic problem-solving approach
- Comfortable working in a fast-paced environment with evolving requirements
- Self-directed with ability to make pragmatic technical decisions
Strongly Preferred
- Experience with trading systems, market-making, or order book dynamics
- Knowledge of sports betting markets or financial trading ecosystems (TradFi/Crypto)
- Experience with Protobufs, Argo Workflows, or similar tools in our stack
- Background in high/mid-frequency or low-latency system design
- Understanding of WebSocket protocols and real-time data streaming
Nice to Have
- Exposure to FIX protocol or other financial messaging standards
- Experience with Streamlit or similar tools for rapid dashboard development
- Knowledge of database CDC patterns (Debezium, etc.)
- Contributions to open-source trading or data infrastructure projects
- Experience working with Rust or C++
Base salary: starting at $140,000