Quick summary
What is it about:
Design, development, and maintenance of a Big Data analytics platform, primarily growth and optimization of the querying/serving backend (mainly Druid, Memcached, MongoDB, Cassandra)
What is needed:
- Knowledge of some of these solutions: Kafka, Flink, TensorFlow, Druid, MongoDB
- Ability to code in Node.js, Python, Go, Java, Rust, or another popular language
- Flexibility to switch to the programming language best suited to the challenge
How advanced it is:
We believe that this position requires two main skills:
- Senior level, or at least 5 years of professional experience
- A “wizard” attitude to challenges – willingness to tackle problems with no solutions reachable in search engines, combined with the professional responsibility to deliver stable software
Where is this gem:
Warsaw, Poland
Deeper summary
- Compete with giants (Google, Amazon) in creating an AI/BI platform that is faster, cheaper, and more productive
- Be part of a rapidly growing startup (50 people) that has made it to market and needs a genius like you to grow even further and win the hearts of clients eager to have their problems solved
- Come to work relaxed and high-five both your colleagues and your bosses, who have put their own money into this venture and trust in you
- Experiment freely and have whatever you need purchased in minutes; forget paralyzing corporate procedures
About Deep BI, Inc.
Deep.BI provides a data platform for several industries. This includes:
- Fast reporting on hundreds of terabytes of data
- Creating and serving AI-powered models
- Real-time data enrichment
We built our own highly available, hybrid data cloud (now more than 3,000 logical cores) and are scaling it horizontally.
Deep.BI makes data collection, integration, storage, analytics, and usage easy. It reduces the complexity of implementing big data technology and thus minimizes risk and cost.
We also experiment with a conversational user interface for our analytics platform, where customers get insights from chatbots. As a next step, we are working on bot-to-bot communication to automate processes (RPA, Robotic Process Automation).
We invite the best, passionate people. Let's talk and find out if there's a fit.
Responsibilities
- You will design, develop, and maintain everything needed to glue together the open-source solutions we use, keeping our backend efficient on its path toward superiority.
- You will help expand our current platform's capabilities and architect new strategies and applications.
- You will shape the future of what data-driven companies look like, drive processes for extracting and using data in creative ways, and create new lines of thinking for the financial success of our customers.
- You'll apply state-of-the-art Big Data tools and techniques, and advise teams on them, in order to derive business insights, solve complex business problems, and improve decisions.
Required qualifications
If you are able to do what we expect, no other requirements are needed.
From our experience, we believe it takes:
- BS or MS in Computer Science or equivalent experience
- 3-8 years of experience
- Knowledge of some of the { Druid, Spark, Kafka, Flink, TensorFlow, Cassandra, MongoDB } solution set
- Fluency in at least one programming language from { Python, Java, Scala, Node.js, Go, Rust }
- Strong programming skills, with a focus on parallel processing
Of course, feel free to come and prove us wrong :)
Our offer
- Market-rate salary, with different types of contracts available, plus paid holidays (20 or 26 days)
- Work in a young startup with solid financing, among passionate and friendly people
- Private medical care
- Stock option plan
- Flexible working hours, possibility of occasional remote work
- Each member of the team has real influence on the product: a state-of-the-art big data & AI platform
- Great office location: a beautiful co-working space on Senatorska Street