We’re looking for an experienced Data Application Engineer to join the Cato team. You will be a critical part of the back-end team, developing our cloud-based proprietary networking and security management solutions. In addition, you will build highly scalable processes to gather and analyze statistics from multiple networks.

Responsibilities:
- End-to-end development of the company’s massive data infrastructure
- Researching new technologies and adapting them for use in the company’s product
- Working closely with the product, DevOps, and security teams

Requirements:
- 4+ years of experience with large-scale data platforms (Storm, Spark, Kafka, SQS, ...) and design principles (data modeling, streaming vs. batch processing, distributed messaging, ...)
- Expertise in one or more of the following languages: Java, Python, Scala, Go
- Hands-on experience designing and building large-scale distributed production systems, with an emphasis on performance
- Familiarity with NoSQL and relational databases; we use technologies such as Elasticsearch, MySQL, ClickHouse, and Redis
- Deep understanding of object-oriented programming and software engineering principles
- Experience with microservices and Kubernetes (k8s) - advantage
- Familiarity with the AWS platform - advantage
- A motivated, fast, independent learner with strong problem-solving skills
- A team player with excellent collaboration and communication skills
- B.Sc. in Computer Science from a recognized university