6.0 - 8.0 years
5 - 8 Lacs
Mumbai
Hybrid
Job Description
Work Mode: Hybrid | Interview Mode: Virtual (2 Rounds) | Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience in designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience in building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, SQL, ETL tools and pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch and real-time integration, Agile methodologies

Mandatory Key Skills: CI/CD, Zeppelin, PyCharm, ETL tools, Control-M, Tableau, performance tuning, Jenkins, QlikView, Informatica, PySpark
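The CDC (Change Data Capture) requirement above boils down to classifying records as inserts, updates, or deletes between two snapshots. A minimal pure-Python sketch of that idea (the function name and sample records are hypothetical; a production pipeline would express the same logic as PySpark DataFrame joins over partitioned data):

```python
def detect_changes(old_rows, new_rows, key="id"):
    """Classify rows as inserts, updates, or deletes by comparing
    an old snapshot against a new one, keyed on `key`."""
    old = {r[key]: r for r in old_rows}
    new = {r[key]: r for r in new_rows}
    inserts = [new[k] for k in new.keys() - old.keys()]
    deletes = [old[k] for k in old.keys() - new.keys()]
    updates = [new[k] for k in new.keys() & old.keys() if new[k] != old[k]]
    return {"insert": inserts, "update": updates, "delete": deletes}

old = [{"id": 1, "qty": 5}, {"id": 2, "qty": 7}]
new = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
changes = detect_changes(old, new)
```

The same three-way split is what a merge/upsert into the target table consumes downstream.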
Posted 18 hours ago
6.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Hybrid
Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience in designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience in building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, SQL, ETL tools and pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch and real-time integration, Agile methodologies

Mandatory Key Skills: Informatica, Jupyter Notebook, API integration, Unix/Linux, Git, AWS S3, Hive, Cloudera, Jasper, Airflow, Hadoop, data modeling, PySpark
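The data-validation and unit-test bullets above can be sketched as a small batch validator in plain Python (field names and rules are illustrative assumptions; in a Spark job the same checks typically become DataFrame filters that route rejects to a quarantine path):

```python
def validate_row(row, required=("id", "amount")):
    """Return a list of validation errors for one record (empty list = valid)."""
    errors = []
    for field in required:
        if row.get(field) is None:
            errors.append(f"missing {field}")
    amount = row.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("amount must be non-negative")
    return errors

def validate_batch(rows):
    """Split a batch into valid rows and (row, errors) rejects."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            valid.append(row)
    return valid, rejected

valid, rejected = validate_batch(
    [{"id": 1, "amount": 10.0}, {"id": 2, "amount": -5}, {"amount": 3}]
)
```

Because the rules live in one pure function, they are easy to cover with the unit test cases the posting asks for.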
Posted 18 hours ago
5.0 - 8.0 years
25 - 40 Lacs
Bengaluru
Work from Office
Job Summary
We are looking for a talented Data Scientist to join our team. The ideal candidate will have a strong foundation in data analysis, statistical models, and machine learning algorithms. You will work closely with the team to solve complex problems and drive business decisions using data. This role requires strategic thinking, problem-solving skills, and a passion for data.

Job Responsibilities
- Analyse large, complex datasets to extract insights and determine the appropriate techniques to use.
- Build predictive models and machine learning algorithms, and conduct A/B tests to assess the effectiveness of models.
- Present information using data visualization techniques.
- Collaborate with different teams (e.g., product development, marketing) and stakeholders to understand business needs and devise possible solutions.
- Stay updated with the latest technology trends in data science.
- Develop and implement real-time machine learning models for various projects.
- Engage with clients and consultants to gather and understand project requirements and expectations.
- Write well-structured, detailed, and compute-efficient code in Python to facilitate data analysis and model development.
- Utilize IDEs such as Jupyter Notebook, Spyder, and PyCharm for coding and model development.
- Apply agile methodology in project execution, participating in sprints, stand-ups, and retrospectives to enhance team collaboration and efficiency.

Education
IC - Typically requires a minimum of 5 years of related experience. Mgr & Exec - Typically requires a minimum of 3 years of related experience.
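The A/B-testing responsibility above commonly reduces to a two-proportion z-test on conversion counts. A stdlib-only sketch (the sample sizes and conversion counts below are invented for illustration):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: does variant B's conversion
    rate differ from variant A's?  Returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_ztest(120, 2400, 150, 2400)  # 5.0% vs 6.25% conversion
```

In practice a library such as `statsmodels` provides the same test, but the closed form makes the mechanics explicit.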
Posted 18 hours ago
6.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The ideal candidate will use their passion for big data and analytics to provide insights to the business covering a range of topics. They will be responsible for conducting both recurring and ad hoc analysis for business users.

Key Requirements:
- Contribute across the GenAI and analytics lifecycle, including implementation, testing, validation, documentation, monitoring, and reporting.
- Identify process improvement opportunities and design agentic applications for analytics and other teams.
- Work with LLMOps and MLOps frameworks to deploy and monitor AI/ML models in real time.
- Collaborate with cross-functional teams (Data Supply, Product, Governance, PMO, and Technology) to deliver analytical solutions.

Key Skills:
- 6-8 years of experience in Data Analytics or Data Science roles.
- Proficient in Python and familiar with relevant packages and tools.
- Hands-on experience with GenAI/LLM frameworks and tools such as LangChain, Watsonx AI, etc.
- Comfortable with development tools such as VS Code, PyCharm, Git, and JIRA.
- Solid understanding of statistical and machine learning techniques.
- Experience with visualization tools such as Power BI.
- Strong verbal, written, and presentation skills.
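Stripped of any LLM framework, the "agentic applications" requirement above is a tool-dispatch loop: a controller executes a plan of tool calls, feeding each output to the next. A deliberately framework-free sketch (the tool names and the fixed plan are hypothetical; in a real build an LLM framework such as LangChain chooses the plan dynamically):

```python
def word_count_tool(text):
    """Toy tool: count words in the payload."""
    return str(len(text.split()))

def upper_tool(text):
    """Toy tool: uppercase the payload."""
    return text.upper()

TOOLS = {"word_count": word_count_tool, "upper": upper_tool}

def run_agent(plan, payload):
    """Execute a plan (list of tool names) over a payload, threading each
    tool's output into the next and recording a trace for monitoring."""
    trace = []
    for name in plan:
        payload = TOOLS[name](payload)
        trace.append((name, payload))
    return payload, trace

result, trace = run_agent(["upper", "word_count"], "generative ai in analytics")
```

The recorded trace is what the posting's monitoring/reporting duties would consume.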
Posted 18 hours ago
6.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Hybrid
Interview Mode: Virtual (2 Rounds) | Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience in designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience in building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, SQL, ETL tools and pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch and real-time integration, Agile methodologies

Mandatory Key Skills: Apache Spark, Python, Unix/Linux, performance tuning, Agile methodologies, Hadoop, ETL, PySpark
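At its core, the Airflow/Control-M orchestration bullet above is dependency-ordered task execution. The standard library's `graphlib` can sketch the idea (the task names form a hypothetical pipeline, not a DAG from the role; Airflow adds scheduling, retries, and distribution on top of this ordering):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on (hypothetical pipeline).
pipeline = {
    "ingest": set(),
    "transform": {"ingest"},
    "validate": {"transform"},
    "load": {"validate"},
    "report": {"load", "validate"},
}

# A valid execution order that respects every dependency.
run_order = list(TopologicalSorter(pipeline).static_order())
```

`TopologicalSorter` also exposes `get_ready()`/`done()` for running independent tasks in parallel, which is closer to how a real scheduler dispatches work.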
Posted 19 hours ago
6.0 - 11.0 years
8 - 12 Lacs
Chennai
Hybrid
Interview Mode: Virtual (2 Rounds) | Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience in designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience in building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, SQL, ETL tools and pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch and real-time integration, Agile methodologies

Mandatory Key Skills: CI/CD, Zeppelin, PyCharm, ETL tools, Control-M, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, PySpark
Posted 20 hours ago
2.0 - 5.0 years
4 - 8 Lacs
Kolkata
Work from Office
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience in designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience in building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, SQL, ETL tools and pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch and real-time integration, Agile methodologies

Mandatory Key Skills: CI/CD, Zeppelin, PyCharm, ETL, Control-M, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix, PySpark
Posted 20 hours ago
3.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra
On-site
Capital Markets - Mumbai
Posted On: 15 Sep 2025 | End Date: 14 Nov 2025 | Required Experience: 3 - 5 Years
No. of Openings: 1 | Designation: Senior Developer | Closing Date: 14 Nov 2025
Business Unit: EQPM / Capital Markets | Legal Entity: QualityKiosk Technologies Private Limited (Navi Mumbai)
Working Location: Mahape, Mumbai, Maharashtra, India | Client Location: NA
Skills: Manual Testing

Job Description
- Experience with creating, deploying, and configuring applications using Python
- Expert knowledge of REST API design and implementation in Flask
- Experience with automated provisioning of Dev/Test/Staging and Production environments using the Azure DevOps CI/CD pipeline
- Experience with version control systems such as Git and SVN (branching, tagging, push)
- Experience using PyCharm or Visual Studio Code
- Experience writing unit tests, integration tests, and REST API tests using Postman or SOAP UI
- Ability to prepare detailed build/test plans to implement new technologies
- Ability to break down application requirements and propose appropriate architectural solutions

Key Responsibilities:
- Designing, building, and deploying on-prem applications to Azure Cloud
- Scripting and programming to deploy and operate our production systems
- Updating incident cases in the IT Service Management system
- Documentation and reporting
- Participate in the on-call rotation; drive incident resolution, live troubleshooting, and impact mitigation
- Evaluate, implement, and maintain software build processes and automation tools to support software product development
- Lead and maintain discipline around build and release operations, best practices, and protocols across the entire development team; run and maintain the different product environments (dev, staging, sandbox, production)
- Help architect, build, and deploy secure infrastructure in support of Dispatch dev teams, including standards for Docker environments, load balancers, and Kubernetes clusters
- Monitor the ticketing system for reported issues and assist development groups in the timely resolution of opened tickets
- Standardize and document development and deployment operations and methods
Posted 1 day ago
3.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra
On-site
MS - Capital Markets - Mumbai
Posted On: 15 Sep 2025 | End Date: 14 Nov 2025 | Required Experience: 3 - 5 Years
No. of Openings: 1 | Designation: Senior Test Engineer | Closing Date: 14 Nov 2025
Business Unit: Quality Engineering / MS - Capital Markets
Working Location: Mahape, Mumbai, Maharashtra, India | Client Location: NA
Skills: Manual Testing

Job Description
- Experience with creating, deploying, and configuring applications using Python
- Expert knowledge of REST API design and implementation in Flask
- Experience with automated provisioning of Dev/Test/Staging and Production environments using the Azure DevOps CI/CD pipeline
- Experience with version control systems such as Git and SVN (branching, tagging, push)
- Experience using PyCharm or Visual Studio Code
- Experience writing unit tests, integration tests, and REST API tests using Postman or SOAP UI
- Ability to prepare detailed build/test plans to implement new technologies
- Ability to break down application requirements and propose appropriate architectural solutions

Key Responsibilities:
- Designing, building, and deploying on-prem applications to Azure Cloud
- Scripting and programming to deploy and operate our production systems
- Updating incident cases in the IT Service Management system
- Documentation and reporting
- Participate in the on-call rotation; drive incident resolution, live troubleshooting, and impact mitigation
- Evaluate, implement, and maintain software build processes and automation tools to support software product development
- Lead and maintain discipline around build and release operations, best practices, and protocols across the entire development team; run and maintain the different product environments (dev, staging, sandbox, production)
- Help architect, build, and deploy secure infrastructure in support of Dispatch dev teams, including standards for Docker environments, load balancers, and Kubernetes clusters
- Monitor the ticketing system for reported issues and assist development groups in the timely resolution of opened tickets
- Standardize and document development and deployment operations and methods
Posted 1 day ago
4.0 - 10.0 years
0 Lacs
Maharashtra
On-site
As a Senior Azure Cloud Engineer at Pristine Retail Solutions, you will play a crucial role in developing and migrating end-to-end solutions that empower our customers to succeed in their markets. Your primary responsibilities will include collaborating with engineering teams, evaluating optimal cloud solutions, migrating applications to the Azure cloud platform, modifying existing systems, and educating teams on new cloud technologies. You will also design, develop, and deploy modular cloud-based systems, ensuring efficient data storage and processing functions in accordance with best practices in cloud security.

**Key Responsibilities:**
- Collaborate with engineering teams to evaluate and identify optimal cloud solutions.
- Migrate existing applications to the Azure cloud platform.
- Modify and improve existing systems.
- Educate teams on new cloud technologies and initiatives.
- Design, develop, and deploy modular cloud-based systems.
- Develop and maintain cloud solutions in accordance with best practices.
- Ensure efficient functioning of data storage and processing functions in line with company security policies.
- Identify, analyze, and resolve infrastructure vulnerabilities and application deployment issues.
- Review business requirements and propose solutions for improvement.
- Interact with clients, provide cloud support, and make recommendations based on client needs.

**Technical Expertise and Familiarity:**
- Cloud Technologies: Azure, AWS, VMWare, Google Cloud, Oracle Cloud.
- Real-time Streaming: Apache Kafka.
- Java: Core Java, Java EE/J2EE, JSP, Servlets, JDBC, JMS, AJAX, JSON, XML, Apache AXIS, JUnit.
- Microsoft: .NET Core, ASP.NET, C#, NUnit.
- Frameworks/Technologies: Spring, Spring Boot, Spring Batch, Hibernate/MyBatis, JPA, MVC, Microservices, Web Services, REST API, JavaScript, jQuery, CSS, ReactJS/AngularJS, testing frameworks, CI/CD pipelines, Jenkins, Docker.
- App Servers: JBoss/WebLogic/WebSphere/Tomcat. Web Servers: Apache/IIS.
- IDEs: Eclipse, STS, Android Studio, Visual Studio, RStudio, PyCharm.
- Databases: Oracle, MySQL, NoSQL databases (e.g., MongoDB), Azure databases, Big Data.
- O/S: Linux, Unix, Windows.
- Experience in Big Data technologies and advanced coursework in AI and Machine Learning is a plus.
- Azure, AWS, and GCP certifications are preferred.

In addition to technical skills, the following soft skills and abilities are crucial for success in this role:

**Soft Skills:**
- Adaptability
- Communication
- Teamwork
- Time Management
- Critical Thinking

**Abilities:**
- Architectural view: the ability to see the big picture and relationships.
- Ability to provide pragmatic solutions considering current realities.
- Excellent written and verbal communication skills.
- Strong presentation capabilities with technical and non-technical audiences.
- Adherence to standards and best practices.
- Ability to earn the team's respect and create consensus through knowledge sharing and empathy towards stakeholders.

**Key Experiences:**
- Data center migration to cloud infrastructure.
- Migrating applications in .NET, Java, R, and Python to the Azure cloud platform.
- Experience with real-time streaming technologies.
- Experience with CI/CD pipelines and Jenkins.
- Complete solution life cycle participation.
- Developing enterprise solutions using REST APIs and microservices.
- Deep knowledge of design and architecture.

Joining Pristine Retail Solutions will provide you with a start-up environment where you can shape the future of the industry. You will work with talented Retail Business Practitioners, Consumer Behavior Analysts, AI Scientists, and Engineers. Additionally, you will have opportunities for personal growth with annual study time, competitive salaries, and generous equity. Apply now by sending your resume to career@pristineinfotech.com and describe why you are supremely qualified for this role.
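Much of the real-time streaming experience listed above (Apache Kafka) centers on windowed aggregation over an event stream. A stdlib sketch of a sliding-window average (the class name and window size are illustrative assumptions; a deployment would feed it from a Kafka consumer rather than hand-supplied timestamps):

```python
from collections import deque

class SlidingWindowAverage:
    """Rolling average over the last `window_s` seconds of events --
    the core of many real-time streaming aggregations."""
    def __init__(self, window_s=60):
        self.window_s = window_s
        self.events = deque()  # (timestamp, value), oldest first
        self.total = 0.0

    def add(self, ts, value):
        """Ingest one event and return the current windowed average."""
        self.events.append((ts, value))
        self.total += value
        # Evict events that fell out of the window.
        while self.events and ts - self.events[0][0] > self.window_s:
            _, old = self.events.popleft()
            self.total -= old
        return self.total / len(self.events)

w = SlidingWindowAverage(window_s=60)
```

Assuming timestamps arrive in order, each event is appended and evicted at most once, so the amortized cost per event is O(1).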
Posted 2 days ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About This Role
Wells Fargo is seeking a Technology Business Systems Manager.

In This Role, You Will:
- Manage and develop a team of business systems consulting staff or technology systems analysts in providing technical solutions of moderate complexity and risk in the area of technology business systems, and engage business and technical groups associated with the function
- Identify opportunities for process improvement and risk factors in key areas of technology risk, including security, stability, and scalability
- Make decisions and resolve issues regarding technology or related policies that may impact multiple lines of business
- Interpret business requirements and provide in-depth technical expertise and consultation to business and systems management
- Collaborate and consult with business and technical groups or management to ensure effective technical solutions are provided based on business requirements
- Interact directly with technology systems analysts to conduct evaluations of business requirements and recommend appropriate technological alternatives
- Manage the allocation of people and financial resources for Technology Business Systems
- Mentor and guide talent development of direct reports and assist in hiring talent

Required Qualifications:
- 4+ years of Technology Business Systems experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- 2+ years of leadership experience

Desired Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (Master's degree preferred).
- Experience in IT Service Management, IT Operations, and enterprise service delivery.
- Hands-on experience with the ServiceNow ITSM platform, including module customization, scripting, and workflow design.
- Strong experience in SQL (MySQL, PostgreSQL, SQL Server) for data querying, optimization, and analytics.
- Expertise in Power BI for building data models, visualizations, and integrating ITSM metrics for executive reporting.
- Deep understanding of the ITIL framework and its practical application in ITSM and ITOM environments.
- Experience with ServiceNow integrations, including MID Server, Integration Hub, and scripting using JavaScript.
- Proven ability to lead cross-functional teams and manage large-scale ITSM projects in Agile/Scrum environments.
- Strong analytical, problem-solving, and communication skills.
- ServiceNow Certified Implementation Specialist (ITSM) or Certified Application Developer.
- Experience with additional ServiceNow modules such as ITOM, HRSD, or CSM.
- Familiarity with cloud platforms (AWS, Azure) and DevOps tools (Jenkins, Git).
- Exposure to GRC, CMDB, and automation frameworks within ServiceNow.

Job Expectations:
- Design, configure, and customize ServiceNow ITSM modules, including Incident, Problem, Change Management, and Service Catalog, to align with enterprise service delivery goals.
- Lead end-to-end ServiceNow ITSM implementations, ensuring alignment with ITIL best practices and organizational objectives.
- Architect and implement integrations between ServiceNow and external systems using REST/SOAP APIs, Python scripts, Power BI connectors, and PyCharm for development and debugging.
- Oversee SQL database operations, including querying, optimization, and data migration, to support reporting and analytics within ServiceNow and BI platforms.
- Drive the creation of interactive dashboards and reports in Power BI by integrating ServiceNow data and other sources to provide actionable insights.
- Collaborate with cross-functional teams, including IT operations, business analysts, and developers, to gather requirements and deliver scalable ITSM solutions.
- Conduct performance tuning, upgrades, and code reviews for ServiceNow instances to ensure optimal performance, security, and compliance.
- Provide technical leadership and mentorship to team members on ServiceNow, Python, SQL, Power BI, and PyCharm best practices.
- Ensure governance, risk management, and compliance (GRC) alignment in all ITSM processes and implementations.
- Manage service delivery KPIs, SLAs, and continual improvement initiatives across ITSM and ITOM domains.
- Lead and manage mid- to large-sized teams (10-50 members) across multiple geographies, ensuring high performance, collaboration, and accountability.
- Foster a culture of innovation, continuous improvement, and customer-centric service delivery within the ITSM function.
- Provide strategic direction, coaching, and career development support to team members, enabling growth and skill enhancement.
- Drive resource planning, workload distribution, and performance management to meet project timelines and service delivery goals.
- Facilitate effective communication across teams and stakeholders, ensuring alignment on priorities, risks, and dependencies.
- Champion Agile and DevOps practices to improve delivery velocity and operational efficiency.
- Resolve conflicts, manage escalations, and ensure team morale and motivation are maintained during high-pressure situations.
- Represent the ITSM team in leadership forums, steering committees, and executive reviews.

Posting End Date: 19 Sep 2025. The job posting may come down early due to the volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture, which firmly establishes those disciplines as critical to the success of our customers and company.
They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements
Third-party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number: R-489949
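The SQL-for-ITSM-analytics expectations in the posting above can be sketched with an in-memory SQLite example computing per-priority SLA compliance (the schema, priorities, and figures are invented for illustration; a Power BI dashboard would consume the same aggregate):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE incidents (
    id INTEGER PRIMARY KEY,
    priority TEXT,
    resolution_hours REAL,   -- how long the incident actually took
    sla_hours REAL)          -- the SLA target for its priority
""")
conn.executemany(
    "INSERT INTO incidents (priority, resolution_hours, sla_hours) VALUES (?, ?, ?)",
    [("P1", 3.5, 4), ("P1", 6.0, 4), ("P2", 10.0, 24), ("P2", 30.0, 24)],
)

# Per-priority SLA compliance: the share of incidents resolved within target.
rows = conn.execute("""
    SELECT priority,
           ROUND(AVG(CASE WHEN resolution_hours <= sla_hours THEN 1.0 ELSE 0.0 END), 2)
    FROM incidents
    GROUP BY priority
    ORDER BY priority
""").fetchall()
```

The `AVG(CASE WHEN ... THEN 1.0 ELSE 0.0 END)` idiom turns a boolean condition into a compliance ratio in a single grouped query.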
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
Role Overview: As a Data Scientist at mPokket, you will be responsible for collaborating with the data science team to plan projects and build analytics models. Your strong problem-solving skills and proficiency in statistical analysis will be key in aligning our data products with our business goals. Your primary objective will be to enhance our products and business decisions through effective utilization of data. Key Responsibilities: - Oversee the data scientists' team and data specialists, providing guidance and support. - Educate, lead, and advise colleagues on innovative techniques and solutions. - Work closely with data and software engineers to implement scalable sciences and technologies company-wide. - Conceptualize, plan, and prioritize data projects in alignment with organizational objectives. - Develop and deploy analytic systems, predictive models, and explore new techniques. - Ensure that all data projects are in sync with the company's goals. Qualifications Required: - Master's degree in Computer Science, Operations Research, Econometrics, Statistics, or a related technical field. - Minimum of 2 years of experience in solving analytical problems using quantitative approaches. - Proficiency in communicating quantitative analysis results effectively. - Knowledge of relational databases, SQL, and experience in at least one scripting language (PHP, Python, Perl, etc.). - Familiarity with statistical concepts such as hypothesis testing, regressions, and experience in manipulating data sets using statistical software (e.g., R, SAS) or other methods. Additional Details: mPokket is a company that values innovation and collaboration. The team culture encourages learning and growth, providing opportunities to work on cutting-edge technologies and projects that have a real impact on the business. Thank you for considering a career at mPokket.
Posted 3 days ago
5.0 years
0 Lacs
hyderabad, telangana, india
On-site
Responsibilities
Develop, test and maintain intuitive, interactive, and modern Python-based evaluation or demonstration user interfaces (50%)
Develop, test and maintain automated validation benches of embedded software developments (30%)
Contribute to the continuous improvement of the development process, tools, and methodologies (10%)
Generate technical documentation, presentations, and training materials to facilitate LoRa technology adoption and design-in (10%)
Minimum Qualifications
Bachelor's degree in Computer Science, Software Engineering, or a related field
5+ years of professional Python development experience
Expert-level knowledge of Python 3.x with a deep understanding of object-oriented programming, design patterns, data structures, algorithms, and advanced Python concepts
Extensive experience with multiple Python GUI frameworks, including PyQt and PySide
Proven knowledge of user interface (UI) and user experience (UX) design principles and a willingness to learn more about creating intuitive and visually appealing user interfaces
Proficiency in tools such as Git, JIRA, Jenkins, etc.
Familiarity with CI/CD processes and with software packaging and distribution
Excellent communication and teamwork skills
Knowledge of web development and/or machine learning concepts is a plus
Desired Qualifications
Python Expertise
Demonstrated ability to write clean, maintainable, and scalable Python code following PEP 8 standards and best practices
Experience in design and implementation of modular, extensible Python applications using architectural patterns, plugin architectures, and configuration management, and in creating reusable Python packages
Proficiency with IDEs (PyCharm, VS Code), debugging tools, and code analysis tools (pylint, flake8, black)
Experience with documentation generation (Sphinx), logging frameworks, and configuration management libraries
GUI Development
Proficiency in creating responsive, multi-threaded GUI applications with complex layouts, custom widgets, signal/slot mechanisms, and real-time data visualization
Expertise in Python packaging (setuptools, pip, conda), virtual environments, and dependency management
Experience with creating distributable Python applications using tools like PyInstaller, cx_Freeze, or similar
Experience with GUI testing frameworks and automated UI testing methodologies
Knowledge of Python package deployment strategies for customer environments
Testing and Quality Assurance
Comprehensive experience with Python testing frameworks including pytest, unittest, and mock
Practiced in test-driven development (TDD), continuous integration, and automated testing pipelines
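The mock-based, test-driven style called for above can be illustrated with a stdlib-only sketch. The `device` object and its `read_version` method are hypothetical stand-ins for a hardware interface (such as a LoRa evaluation board), not a real API:

```python
# Minimal sketch of testing hardware-facing code with unittest.mock:
# the real transport is replaced by a Mock so the parsing logic can be
# verified without hardware attached. All names here are illustrative.
from unittest.mock import Mock

def firmware_major(device) -> int:
    """Parse the major version from an 'x.y.z' string returned by the device."""
    raw = device.read_version()  # e.g. "2.4.1" from a real device
    return int(raw.split(".")[0])

fake_device = Mock()
fake_device.read_version.return_value = "2.4.1"

print(firmware_major(fake_device))  # → 2
```

In a pytest suite the same pattern would live in a test function, with the Mock injected as a fixture.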
Posted 3 days ago
0 years
0 Lacs
pune, maharashtra, india
On-site
Job Description Lead end-to-end solutioning of AI agents and agentic workflows, including design, development, testing, and deployment. Build business cases, conduct process reviews, perform data analysis, and document business requirements. Apply prompt and context engineering techniques to optimize AI model performance. Work with cutting-edge Generative AI models such as OpenAI (ChatGPT), Anthropic, Microsoft, and Meta (LLaMA). Develop and deploy AI solutions using Python, Java, and frameworks like Flask, integrating with APIs and App Engine. Implement and manage CI/CD pipelines for efficient and secure deployment. Ensure secure handling of API keys, passwords, and tokens across environments. Use Docker for containerization and GitHub for version control and collaborative development. Conduct testing of AI agents and workflows to ensure reliability, accuracy, and performance. Collaborate with cross-functional teams to identify business problems and apply AI-driven solutions. Mentor Operations staff on AI solutioning, prompt design, and best practices. Drive AI adoption across the Operations Practice through enablement sessions, bootcamps, and strategic initiatives. Execute rapid-response AI projects with a focus on delivery and measurable business impact. Act as a subject matter expert and internal champion for AI capabilities.
Key tools and technologies: OpenAI ChatGPT, Anthropic, Microsoft, and Meta LLaMA; Agentic AI; Python, Java, PyCharm, Flask; Docker; GitHub; API integration & App Engine; Ansible; Terraform
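The agentic-workflow pattern described above can be sketched in a few lines. The model call is stubbed out (no real OpenAI/Anthropic SDK is used), and the `TOOL:`/`FINAL:` reply convention and tool registry are illustrative assumptions, not any vendor's protocol:

```python
# A hedged sketch of an agent loop: the model proposes either a tool call
# or a final answer; the loop executes tools and feeds results back.
def call_model(prompt: str) -> str:
    # Stub standing in for an LLM API call; logic here is purely illustrative.
    if "Tool result:" in prompt:
        return "FINAL:" + prompt.split("Tool result:")[-1].strip()
    if "2 + 2" in prompt:
        return "TOOL:add 2 2"
    return "FINAL:unknown"

TOOLS = {"add": lambda a, b: str(int(a) + int(b))}  # hypothetical tool registry

def run_agent(task: str, max_steps: int = 3) -> str:
    prompt = task
    for _ in range(max_steps):
        reply = call_model(prompt)
        if reply.startswith("FINAL:"):
            return reply[len("FINAL:"):]
        if reply.startswith("TOOL:"):
            name, *args = reply[len("TOOL:"):].split()
            prompt = f"{task}\nTool result: {TOOLS[name](*args)}"
    return "max steps reached"

print(run_agent("what is 2 + 2?"))  # → 4
```

A production version would replace `call_model` with a real SDK call and handle API keys via a secrets manager, as the posting's security requirements imply.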
Posted 3 days ago
4.0 - 9.0 years
15 - 25 Lacs
hyderabad, bengaluru
Hybrid
Role & responsibilities We are looking for a skilled Automation Engineer with expertise in Python testing using Selenium and Pytest, along with strong experience in both automation and manual functional testing. Key Responsibilities: Automation Testing: Develop, execute, and maintain automation test scripts using Python and Selenium. Design and implement frameworks for automated testing using Pytest. Integrate automated tests into CI/CD pipelines to ensure continuous testing during development cycles. Identify and report issues, bugs, and defects during automation testing and collaborate with developers for resolution. Create detailed test plans, test cases, and test data based on business requirements and acceptance criteria. Execute test cases to ensure functionality, usability, and performance meet specifications. Document and report test results, defects, and issues using testing tools. Quality Assurance: Ensure comprehensive test coverage across all layers of the application (UI, API, and backend). Conduct regression testing to ensure existing functionality remains unaffected by new changes. Collaborate with developers, business analysts, and product owners to understand requirements and ensure alignment. Participate in sprint planning and reviews to provide input on testing estimates and priorities. About the company Since 1993, EPAM Systems, Inc. (NYSE: EPAM) has leveraged its advanced software engineering heritage to become the foremost global digital transformation services provider leading the industry in digital and physical product development and digital platform engineering services. Through its innovative strategy; integrated advisory, consulting, and design capabilities; and unique 'Engineering DNA,' EPAM's globally deployed hybrid teams help make the future real for clients and communities around the world by powering better enterprise, education and health platforms that connect people, optimize experiences, and improve people's lives.
In 2021, EPAM was added to the S&P 500 and included among the list of Forbes Global 2000 companies. Selected by Newsweek as a 2021 Most Loved Workplace, EPAM's global multi-disciplinary teams serve customers in more than 45 countries across five continents. As a recognized leader, EPAM is listed among the top 15 companies in Information Technology Services on the Fortune 1000 and ranked as the top IT services company on Fortune's 100 Fastest-Growing Companies list for the last three consecutive years. EPAM is also listed among Ad Age's top 25 World's Largest Agency Companies for three consecutive years, and Consulting Magazine named EPAM Continuum a top 20 Fastest-Growing Firm.
Posted 4 days ago
3.0 - 8.0 years
4 - 8 Lacs
bengaluru
Work from Office
About The Role Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : React.js Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : Bachelor's degree in Computer Science, Software Engineering or related field
Summary: We're looking for a skilled ReactJS Developer to join our front-end development team. You'll be responsible for building responsive and dynamic user interfaces using ReactJS, collaborating with designers and backend engineers to deliver seamless user experiences.
Roles & Responsibilities:
1. Develop new user-facing features using ReactJS
2. Build reusable components and front-end libraries for future use
3. Translate UI/UX designs and wireframes into high-quality code
4. Optimize components for maximum performance across devices and browsers
5. Implement state management using Redux or similar libraries
6. Collaborate with backend developers to integrate APIs and services
7. Conduct code reviews and maintain code quality
8. Stay updated with React ecosystem trends and best practices.
Professional & Technical Skills:
1. Strong proficiency in JavaScript, including ES6+ features
2. Deep understanding of ReactJS and its core principles (JSX, components, props, state, lifecycle)
3. Experience with Redux, Context API, or other state management tools
4. Familiarity with RESTful APIs and asynchronous programming
5. Proficiency in HTML5, CSS3, and responsive design
6. Experience with tools like Webpack, Babel, and Git
7. Good to know: Docker, Azure DevOps, Azure cloud basics
8. Tools: PyCharm/VS Code, PostgreSQL, GitHub, Podman/WSL
Additional Information:
- The candidate should have a minimum of 3 years of experience in front-end development using React.
- This position is based at our Bengaluru (client) office only.
- A Bachelor's degree in Computer Science, Software Engineering or related field is required.
Qualification: Bachelor's degree in Computer Science, Software Engineering or related field
Posted 5 days ago
0 years
0 Lacs
gurugram, haryana, india
On-site
Hands-on data automation engineer with strong Python or Java coding skills and solid SQL expertise, who can work with large data sets, understand stored procedures, and independently write data-driven automation logic. Develop and execute test cases with a focus on Fixed Income trading workflows. The requirement goes beyond automation tools and aligns better with a junior developer or data automation role.
Desired Skills and Experience:
Strong programming experience in Python (preferred) or Java.
Strong experience working with Python and its libraries like Pandas, NumPy, etc.
Hands-on experience with SQL, including:
Writing and debugging complex queries (joins, aggregations, filtering, etc.)
Understanding stored procedures and using them in automation
Experience working with data structures, large tables and datasets
Comfort with data manipulation, validation, and building comparison scripts
Nice to have:
Familiarity with PyCharm, VS Code, or IntelliJ for development, and an understanding of how automation integrates into CI/CD pipelines
Prior exposure to financial data or post-trade systems (a bonus)
Excellent communication skills, both written and verbal
Experience working with test management tools (e.g., X-Ray/JIRA).
Extremely strong organizational and analytical skills with strong attention to detail. Strong track record of excellent results delivered to internal and external clients. Able to work independently without the need for close supervision and collaboratively as part of cross-team efforts. Experience delivering projects within an agile environment.
Key Responsibilities:
Write custom data validation scripts based on provided regression test cases
Read, understand, and translate stored procedure logic into test automation
Compare datasets across environments and generate diffs
Collaborate with team members and follow structured automation practices
Contribute to building and maintaining a central automation script repository
Establish and implement comprehensive QA strategies and test plans from scratch
Develop and execute test cases with a focus on Fixed Income trading workflows
Drive the creation of regression test suites for critical back-office applications
Collaborate with development, business analysts, and project managers to ensure quality throughout the SDLC
Provide clear and concise reporting on QA progress and metrics to management
Bring strong subject matter expertise in the Financial Services industry, particularly fixed income trading products and workflows
Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
Independently troubleshoot difficult and complex issues in different environments
Take responsibility for end-to-end delivery of projects, coordination between the client and internal offshore teams, and managing client queries
Demonstrate high attention to detail, work in a dynamic environment whilst maintaining high quality standards, with a natural aptitude for developing good internal working relationships and a flexible work ethic
Take responsibility for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT)
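The "compare datasets across environments and generate diffs" responsibility above can be sketched with the stdlib `sqlite3` module standing in for the real trading databases; the table and column names are illustrative:

```python
# Stdlib-only sketch of a cross-environment data diff: rows present in
# env_a but missing or different in env_b, via an EXCEPT-style query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE env_a(trade_id TEXT PRIMARY KEY, notional REAL);
    CREATE TABLE env_b(trade_id TEXT PRIMARY KEY, notional REAL);
    INSERT INTO env_a VALUES ('T1', 100.0), ('T2', 250.0), ('T3', 75.0);
    INSERT INTO env_b VALUES ('T1', 100.0), ('T2', 260.0);
""")

diff = conn.execute("""
    SELECT trade_id, notional FROM env_a
    EXCEPT
    SELECT trade_id, notional FROM env_b
    ORDER BY trade_id
""").fetchall()

print(diff)  # → [('T2', 250.0), ('T3', 75.0)]
```

A symmetric query with the tables swapped would catch rows present only in env_b; a real comparison script would run both and report the union.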
Posted 5 days ago
5.0 - 10.0 years
4 - 8 Lacs
chennai
Work from Office
What you'll do Develop, maintain, and enhance new data sources and tables, contributing to data engineering efforts to ensure comprehensive and efficient data architecture. Serve as the liaison between the Data Engineering team and the airport operations teams, developing new data sources and overseeing enhancements to the existing database; act as one of the main contact points for data requests, metadata, and statistical analysis. Migrate all existing Hive Metastore tables to Unity Catalog, addressing access issues and ensuring a smooth transition of jobs and tables. Collaborate with IT teams to validate package (gold-level data) table outputs during the production deployment of developed notebooks. Develop and implement data quality alerting systems and Tableau alerting mechanisms for dashboards, setting up notifications for various thresholds. Create and maintain standard reports and dashboards to provide insights into airport performance, helping guide stations to optimize operations and improve performance. All you'll need for success Preferred Qualifications - Education & Prior Job Experience Master's degree (UG minimum) 5-10 years of experience Databricks (Azure) Good communication skills Experience developing solutions on a Big Data platform utilizing tools such as Impala and Spark Advanced knowledge/experience with Azure Databricks, PySpark, (Teradata)/Databricks SQL Advanced knowledge/experience in Python along with associated development environments (e.g. JupyterHub, PyCharm, etc.) Advanced knowledge/experience in building dashboards with Tableau, QlikView and Power BI Basic knowledge of HTML and JavaScript Immediate joiner Skills, Licenses & Certifications Strong project management skills Proficient with Microsoft Office applications (MS Excel, Access and PowerPoint); advanced knowledge of Microsoft Excel Advanced aptitude in problem-solving, including the ability to logically structure an appropriate analytical framework Proficient in SharePoint, PowerApps, and ability to use Graph API
Posted 6 days ago
1.0 - 4.0 years
4 - 8 Lacs
hyderabad
Work from Office
Experience using analytical tools/languages and libraries. Experience in designing and developing machine learning systems or AI products, implementing appropriate ML algorithms using popular ML frameworks.
Posted 6 days ago
0 years
0 Lacs
bhopal, madhya pradesh, india
On-site
Relu Consultancy is seeking a Data Extraction Engineer with expertise in Python (Selenium). In this role, you will design, implement, and maintain robust data scraping solutions that drive our projects forward. This is your chance to contribute to cutting-edge initiatives while enjoying a work schedule tailored to your needs. Job Title: Data Extraction Engineer Location: Bhopal, MP Job Type: Full-Time CTC: 5LPA Responsibilities: 1. Work on web scraping or data extraction through Selenium/Scrapy or other frameworks and related libraries. 2. Working knowledge of various DBMSs, message queues & RESTful web APIs. 3. Design, build, and maintain high-performance, reusable, and reliable Python code. 4. Ensure the best possible performance, quality, and responsiveness of the application. 5. Identify and correct bottlenecks and fix bugs. 6. Help maintain code quality, organization, and documentation. Qualifications: 1. Experience with the Python platform and object-oriented programming. 2. Python libraries - Pandas, NumPy, Matplotlib, Beautiful Soup, Selenium, Tabula. 3. Databases - MySQL, SQL, MongoDB. 4. IDEs - PyCharm, Spyder, and Jupyter Notebook. 5. Communication skills - Python developers need strong verbal communication skills to work with other members of the programming team and participate in a collaborative environment. 6. Analytical ability - because Python developers analyze programs to improve their functionality, these professionals have strong analytical skills and critical thinking abilities. Why Join Us: Opportunity to work with a reputable consultancy firm. Flexible schedule. Competitive hourly rate. Collaborative and supportive team environment. Opportunity for professional growth. If you're a Data Extraction Engineer seeking a full-time role that offers the chance to work with a dynamic consultancy firm, we'd love to hear from you. Join us in contributing to our organization's success and in making a positive impact on our team and clients.
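The core of the extraction work described above is pulling structured data out of HTML. Real jobs would drive Selenium or Scrapy against live pages; the stdlib sketch below shows the parsing step on an illustrative snippet, with no network or browser required:

```python
# Stdlib-only sketch of HTML data extraction: collect all <a href> values
# from a page fragment using html.parser (Beautiful Soup or Scrapy would
# do the same job with less code in a production scraper).
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = '<ul><li><a href="/item/1">One</a></li><li><a href="/item/2">Two</a></li></ul>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # → ['/item/1', '/item/2']
```

With Selenium the `page` string would instead come from `driver.page_source` after the browser has rendered any JavaScript.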
Posted 6 days ago
2.0 years
0 Lacs
mumbai, maharashtra, india
On-site
About BNP Paribas India Solutions Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, European Union’s leading bank with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 10000 employees, to provide support and develop best-in-class solutions. About BNP Paribas Group BNP Paribas is the European Union’s leading bank and key player in international banking. It operates in 65 countries and has nearly 185,000 employees, including more than 145,000 in Europe. The Group has key positions in its three main fields of activity: Commercial, Personal Banking & Services for the Group’s commercial & personal banking and several specialized businesses including BNP Paribas Personal Finance and Arval; Investment & Protection Services for savings, investment, and protection solutions; and Corporate & Institutional Banking, focused on corporate and institutional clients. Based on its strong diversified and integrated model, the Group helps all its clients (individuals, community associations, entrepreneurs, SMEs, corporate and institutional clients) to realize their projects through solutions spanning financing, investment, savings and protection insurance. In Europe, BNP Paribas has four domestic markets: Belgium, France, Italy, and Luxembourg. The Group is rolling out its integrated commercial & personal banking model across several Mediterranean countries, Turkey, and Eastern Europe. As a key player in international banking, the Group has leading platforms and business lines in Europe, a strong presence in the Americas as well as a solid and fast-growing business in Asia-Pacific. 
BNP Paribas has implemented a Corporate Social Responsibility approach in all its activities, enabling it to contribute to the construction of a sustainable future, while ensuring the Group's performance and stability. Commitment to Diversity and Inclusion At BNP Paribas, we passionately embrace diversity and are committed to fostering an inclusive workplace where all employees are valued, respected and can bring their authentic selves to work. We prohibit Discrimination and Harassment of any kind and our policies promote equal employment opportunity for all employees and applicants, irrespective of, but not limited to their gender, gender identity, sex, sexual orientation, ethnicity, race, colour, national origin, age, religion, social status, mental or physical disabilities, veteran status etc. As a global Bank, we truly believe that inclusion and diversity of our teams is key to our success in serving our clients and the communities we operate in. About Business Line/Function ITG is a group function established in ISPL in 2019, with a presence in Mumbai and Chennai. We collaborate with various business lines of the group to provide IT Services. Job Title QA Engineer Date 11-Aug-2025 Department ITG - IT Transversal Location: Mumbai / Thane Business Line / Function Compliance IT Reports To (Direct) ISPL CPL IT Manager Grade (if applicable) (Functional) Number Of Direct Reports 1 Directorship / Registration NA Position Purpose In the context of a strategic transformation of Compliance Data for BNPP, the QA Engineer will help validate business requirements and automate them. Working in alignment with the local team lead, the QA Engineer will be responsible for testing all user stories on the backlog to a good standard of quality and for increasing automation coverage for the application.
Responsibilities Direct Responsibilities Requirement analysis of the application under test Validation of the assigned user stories Ensure quality of testing Ensure good reporting of progress to the team lead Ability to drive the deliverables for self and the team when needed Automate E2E workflows of the application Ensure increased automation and penetration coverage Contributing Responsibilities Ensure a good level of commitment to avoid global schedule shifts due to dependencies Technical & Behavioral Competencies Expert in automation using Selenium Cucumber BDD or Robot Framework Expert in designing automation frameworks Expert in writing automated scripts Experience in DevOps Experience in SQL queries or MongoDB Good to have experience in API testing Experience in functional and end-to-end testing Experience in Agile & Scrum Specific Qualifications (if Required) Selenium Cucumber BDD or Robot Framework, DevOps, IntelliJ, GitLab (Pipeline CI/CD), Python, PyCharm, JIRA, ALM Octane Skills Referential Behavioural Skills: Ability to collaborate / Teamwork Attention to detail / rigor Organizational skills Communication skills - oral & written Transversal Skills: Ability to understand, explain and support change Ability to manage a project Experience Level At least 2 years
Posted 6 days ago
8.0 - 12.0 years
18 - 25 Lacs
gurugram
Hybrid
Job Summary: Seasoned IT software delivery professional with 8-12 years of hands-on development experience. The candidate will specialize in ETL testing, Informatica, Python, Oracle PL/SQL, performance tuning, and data warehousing, and should bring strong technical proficiency and agile delivery experience. Responsibilities: Perform ETL testing for complex data workflows. Design, maintain, and optimize data pipelines using Informatica and Oracle PL/SQL. Implement performance tuning for databases and jobs. Write/execute Python scripts for automation and validation. Design and manage data warehousing solutions. Utilize Pytest, Unittest, AWS, and Co-Pilot. Collaborate on system design, architecture, and agile sprints. Use development tools (PyCharm/VS Code). Strong experience in Informatica, Oracle PL/SQL, performance tuning, data warehousing. Agile/continuous delivery experience. Good to Have: Control-M job scheduler experience. Oracle Security understanding. Tableau and other reporting tools experience. Additional scripting/cloud technology exposure.
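The "Python scripts for automation and validation" duty above typically means checks run after each ETL load. The sketch below uses stdlib `sqlite3` as a stand-in for Oracle; the staging/target table names and the two checks (row-count reconciliation, NOT NULL audit) are illustrative:

```python
# Stdlib-only sketch of post-load ETL validation: compare staging vs target
# row counts and audit a column that should not contain NULLs.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging(id INTEGER, amount REAL);
    CREATE TABLE target(id INTEGER, amount REAL);
    INSERT INTO staging VALUES (1, 10.0), (2, 20.0), (3, NULL);
    INSERT INTO target  VALUES (1, 10.0), (2, 20.0), (3, NULL);
""")

def validate(conn):
    src = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
    nulls = conn.execute(
        "SELECT COUNT(*) FROM target WHERE amount IS NULL").fetchone()[0]
    return {"counts_match": src == tgt, "null_amounts": nulls}

print(validate(conn))  # → {'counts_match': True, 'null_amounts': 1}
```

In a Pytest/Unittest suite each check would become its own assertion, so a failed reconciliation is reported as a distinct test failure.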
Posted 6 days ago
0 years
0 Lacs
mumbai, maharashtra, india
On-site
Position Overview In Scope of Position based Promotions (INTERNAL only) Job Title: Valuations Analyst Corporate Title: NCT Location: Mumbai, India Role Description About MVRM-Valuations Control Valuations Control (VRAC) is a specialist group within ‘Market Valuations & Risk Management’ (MVRM) department under the Risk umbrella. With over 300+ professionals, the function has a Global presence across Mumbai, London, New York and Singapore that operate as business/asset class aligned organisational matrix supported by central functions to provision all official valuation results to key stakeholders across MVRM and Management Board. MRC/PTV Team Model Risk Control & Product Tagging Validation team within VRAC in Mumbai ensures trades are booked on approved Product-Model combination and validates independently appropriateness of Product Name (“Tag”) assigned to trades for all asset classes globally. The product name is used to determine model appropriateness and classify trades for various reporting processes (such as Regulatory Reporting, trader mandates, etc.). This control is a high-focus area for regulators (BaFin, ECB, and the Fed). In addition to tag validation, the team is also responsible for calculating FV Reserves due to Model limitation/deficiency and providing transparency on IFRS levelling. There are Regulatory commitments associated with the work performed by the team, so team visibility is high and tight deadlines are common. In addition to tag validation, the team is responsible for assisting the Front Office with the remediation of tagging exceptions and the approval of new products. The ultimate goal is to establish an efficient, accurate, up-front control over the tagging of trades such that error detection and subsequent remediation are not required. Therefore, there is a substantial amount of project work in addition to a business as usual process.
What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy Best in class leave policy Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for Industry relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term life Insurance Complimentary health screening for employees aged 35 and above Your Key Responsibilities As a product specialist you will be responsible for: Analysing products and trade booking to determine the logic required to automatically determine the product type and features. Candidates should have a good understanding of vanilla & exotic products like Bonds, IR Swaps, Swaptions, Inflation Swaps, Inflation Options, CDS, FX Options, Equity Derivatives, Loans etc. Analyze and explain the outputs and publish the results to a broader audience. Working with Trading and Strats to remediate product tagging and definition issues to improve the Model Risk Control environment. Owning the accuracy of Management Information for a certain population of trades that get reported to the Model Risk Council. Building relationships with stakeholders (e.g. front office) through regular interaction and co-operation, but always acting in the best interests of the Bank Managing operational risk by ensuring processes are documented and staff are cross-trained. Developing your technical expertise to ensure you have the knowledge to face-off against technical experts in divisions outside of Business Finance. Producing presentations and communicating progress to Auditors and Regulators. Experience of hands-on development, ideally in Python or C++, and a desire to continue doing this on a day-to-day basis People Management The behaviours provided below should be adopted by all Deutsche Bank employees in relation to their development and management of others.
Supports the development of an environment where people management and development is the number one priority. Coaches direct reports and others in the organisation, as appropriate Actively supports the business strategy, plans and values, contributing to the achievement of a high performance culture Takes ownership for own career management, seeking opportunities for continuous development of personal capability and improved performance contribution Acts as a role model for new employees, providing help and support to facilitate early integration and assimilation of their new environment Supports tough people decisions to ensure people performance is aligned with organisation imperatives and needs. Addresses individual performance issues, where necessary, to drive for high performance Experience/ Exposure Your skills and experience Previous experience working with banking products and understanding how they’re booked Pricing and modeling of derivative products Knowledge of front-to-back architecture of Investment Banks Strong understanding of financial markets, products, derivative pricing Excellent communication skills – ability to articulate technical and financial topics with global stakeholders Programming experience in Python or C++ is a plus. Knowledge of PyCharm, Git, Bitbucket, and Tableau is an advantage. The candidate should have implemented/completed AI-related projects.
Education/ Qualifications/Character Degree – 2.1 or above (or equivalent) ACA, CIMA, CFA, Relevant Masters Degree Control focused, deadline orientated, team player with high attention to detail How We’ll Support You Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 week ago
2.0 - 3.0 years
3 - 5 Lacs
navi mumbai
On-site
Responsibilities: Design and execute automated test scripts for Android, iOS, Mobile, and Apple TV apps. Perform functional, regression, and API testing. Manage defects using Jira with clear documentation. Collaborate with cross-functional teams to identify scope, troubleshoot, and resolve issues. Support CI/CD pipeline integration and test automation in Jenkins. Requirements: 2–3 years of experience in automation testing. Strong knowledge of Android, iOS, Mobile & Apple TV app testing. Hands-on expertise in Robot Framework, PyCharm, Jenkins, and Jira. Experience in API automation and regression testing. Good understanding of defect management, test planning, and reporting. Strong analytical, problem-solving, and communication skills. Job Type: Full-time Pay: ₹300,000.00 - ₹500,000.00 per year Benefits: Provident Fund Work Location: In person
Posted 1 week ago
2.0 - 4.0 years
7 - 8 Lacs
gurgaon
On-site
Locations: Bengaluru | Gurgaon

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
As part of the BCG X A&A team, you will work closely with consulting teams on a diverse range of advanced analytics topics. You will have the opportunity to leverage analytical methodologies to deliver value to BCG's Consulting (case) teams and Practice Areas (domains) by providing analytics subject-matter expertise and accelerated execution support. You will collaborate with case teams to gather requirements and to specify, design, develop, deliver, and support analytic solutions serving client needs. You will provide technical support through a deeper understanding of relevant data analytics solutions and processes to build high-quality, efficient analytic solutions.
You're Good At
- Working with case (and proposal) teams
- Acquiring deep expertise in at least one analytics topic and an understanding of all analytics capabilities
- Defining and explaining the expected analytics outcome; defining the approach selection
- Delivering original analysis and insights to BCG teams, typically owning all or part of an analytics module and integrating with case teams
- Establishing credibility by thought-partnering with case teams on analytics topics; drawing conclusions on a range of external and internal issues related to their module
- Communicating analytical insights through sophisticated synthesis and packaging of results (including PowerPoint presentations, documents, dashboards, and charts); collecting, synthesizing, and analysing case-team learnings as inputs into new best practices and methodologies
- Building a collateral of documents to enhance core capabilities and support reference for internal documents; sanitizing confidential documents and maintaining a repository
- Leading workstreams and modules independently or with minimal supervision
- Supporting business development activities (proposals, vignettes, etc.) and building sales collateral to generate leads

Team Requirements
- Guides juniors on analytical methodologies and platforms, and helps in quality checks
- Contributes to the team's content and IP development
- Imparts technical training to team members and the consulting cohort

Technical Skills
- Strong proficiency in statistics (concepts and methodologies such as hypothesis testing and sampling) and its application and interpretation
- Hands-on data mining and predictive modeling experience: Linear Regression; Clustering (K-means, DBSCAN, etc.); Classification (Logistic Regression, Decision Trees/Random Forest/Boosted Trees); Time series (SARIMAX/Prophet); etc.
- Strong experience with at least one of the prominent cloud providers (Azure, AWS, GCP) and working knowledge of AutoML solutions (SageMaker, Azure ML, etc.)
At least one tool in each category:
- Programming languages: Python (must have); R, SAS, or PySpark; SQL (must have)
- Data visualization: Tableau, QlikView, Power BI, Streamlit
- Data management: Alteryx, MS Access, or any RDBMS
- ML deployment tools: Airflow, MLflow, Luigi, Docker, etc.
- Big data technologies: Hadoop ecosystem, Spark
- Data warehouse solutions: Teradata, Azure SQL DW/Synapse, Redshift, BigQuery, etc.
- Version control: Git/GitHub/GitLab
- MS Office: Excel, PowerPoint, Word
- Coding IDE: VS Code/PyCharm
- GenAI tools: OpenAI, Google PaLM/BERT, Hugging Face, etc.

Functional Skills
- Expertise in building analytical solutions and delivering tangible business value for clients (similar to the use cases below)
- Price optimization, promotion effectiveness, product assortment optimization, sales force effectiveness, personalization/loyalty programs, labor optimization
- CLM and revenue enhancement (segmentation, cross-sell/up-sell, next product to buy, offer recommendation, loyalty, LTV maximization, and churn prevention)
- Communicating with confidence and ease: you will be a clear and confident communicator, able to deliver messages concisely, with strong and effective written and verbal communication

What You'll Bring
- Bachelor's/Master's degree in a field linked to business analytics, statistics or economics, operations research, applied mathematics, computer science, engineering, or a related field required; advanced degree preferred
- At least 2-4 years of relevant industry work experience providing analytics solutions in a commercial setting
- Prior work experience in a global organization, preferably in a data analytics role in a professional services organization
- Demonstrated depth in one or more industries, including but not limited to Retail, CPG, Healthcare, and Telco
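As a concrete illustration of one of the clustering methods named in the technical skills above, here is a minimal from-scratch K-means (Lloyd's algorithm) sketch on 2-D points. In practice one would use scikit-learn's `KMeans`; this toy version uses only the standard library, and the sample data is invented for the example.

```python
# Illustrative only: a dependency-free K-means sketch (Lloyd's algorithm).
import random

def kmeans(points, k, iters=20, seed=0):
    """Cluster 2-D points into k groups; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        for i, (x, y) in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: (x - centroids[c][0]) ** 2 + (y - centroids[c][1]) ** 2,
            )
        # Update step: move each centroid to the mean of its assigned points.
        for c in range(k):
            members = [p for p, lbl in zip(points, labels) if lbl == c]
            if members:
                centroids[c] = (
                    sum(x for x, _ in members) / len(members),
                    sum(y for _, y in members) / len(members),
                )
    return centroids, labels

# Two well-separated blobs should land in two different clusters.
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (9.0, 9.1), (9.2, 9.0), (9.1, 9.2)]
centroids, labels = kmeans(data, k=2)
```

The assignment/update loop is the same structure scikit-learn implements; the library adds smarter initialization (k-means++), convergence checks, and vectorized distance computations.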
Who You'll Work With
Our data analytics and artificial intelligence professionals mix deep domain expertise with advanced analytical methods and techniques to develop innovative solutions that help our clients tackle their most pressing issues. We design algorithms and build complex models out of large amounts of data.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
Posted 1 week ago