Overview:
Seeking a Java Spark Developer with expertise in big data processing, Core Java, and Apache Spark, particularly within the finance domain. This role involves developing and optimizing data pipelines for risk calculations, trade analytics, and regulatory reporting.

Required Qualifications/Skills/Experience:
- 7+ years of experience in software development, with at least 3 years of experience in Java, Spark, and Big Data frameworks
- Strong proficiency in Python and Java Spark, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.)
- Experience working in financial markets, risk management, and financial instruments
- Familiarity with market risk concepts, including VaR, Greeks, scenario analysis, and stress testing
- Hands-on experience with Hadoop and Spark
- Proficiency in Git, Jenkins, and CI/CD pipelines
- Excellent problem-solving skills and a strong mathematical and analytical mindset
- Ability to work in a fast-paced financial environment

Job Duties:
- Develop and optimize scalable Java Spark-based data pipelines for processing and analyzing large-scale financial data
- Design and implement distributed computing solutions for risk modeling, pricing, and regulatory compliance
- Ensure efficient data storage and retrieval using Big Data technologies
- Implement best practices for Spark performance tuning, including partitioning, caching, and memory management
- Maintain high code quality through testing, CI/CD pipelines, and version control (Git, Jenkins)
- Work on batch processing frameworks for market risk analytics

**Only those lawfully authorized to work in the designated country associated with the position will be considered.**
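For candidates unfamiliar with the market risk concepts named above, VaR (Value-at-Risk) can be sketched via historical simulation. This is a minimal Python illustration, not part of the posting; the function name and sample P&L figures are assumptions for demonstration only:

```python
def historical_var(pnl, confidence=0.95):
    """Historical-simulation VaR: the loss threshold (as a positive
    number) that daily P&L is expected to breach only (1 - confidence)
    of the time, estimated from the empirical distribution."""
    ordered = sorted(pnl)                        # worst losses first
    index = int((1 - confidence) * len(ordered)) # tail cutoff position
    return -ordered[index]                       # flip sign: report loss as positive

# Hypothetical daily P&L series (e.g., in $ millions)
daily_pnl = [-1.2, 0.8, -0.5, 2.1, -3.4, 0.3, -0.9, 1.5, -2.2, 0.6]
print(historical_var(daily_pnl, 0.95))  # → 3.4
```

In production such calculations run over far larger scenario sets, which is where distributed frameworks like Spark come in.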