About the Role:
Be part of the team building the next generation of compute platform at Citadel. The team is already responsible for running compute in the 100K+ core range and pushing the boundaries of scalability and elasticity. Working across most of Citadel’s technology groups, you will be exposed to many different aspects of Citadel’s business and will have a direct and immediate impact on it.
We are a small, high-profile team that moves quickly to ship real production products to the entire firm. Our team members take ownership of their products, managing the entire lifecycle and potentially building separate teams around them as they scale.
Responsibilities:
- Design and build production cloud-based batch and streaming data processing systems
- Provide infrastructure tooling for cloud and hybrid computing
- Provide operational support for your production systems
- Run and debug complex distributed environments
- Evaluate new technologies by building and running PoCs
- Assess and communicate the potential business impact of those technologies
- Act as a cloud architectural consultant for internal application teams
- Work with Front Office IT teams through entire project lifecycles
Qualifications:
- Experience designing and running large batch and streaming data processing systems, e.g. Hadoop or Spark.
- Experience building and running large data stores, e.g. Cassandra, MongoDB, Riak, HBase.
- Experience with DevOps practices such as configuration management and infrastructure automation (e.g. Ansible, Packer, Jenkins).
- Experience building production systems on open source platforms.
- Solid knowledge of various languages, preferably Python, C++ or Java.
- Experience designing and deploying elastic applications on public or private clouds (AWS, GCE, OpenStack).
- Solid understanding of encryption and security.
- Solid Linux systems knowledge.
Education: Undergraduate degree in Computer Science or a related engineering or science discipline.