
8417 PySpark Jobs - Page 16

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 5.0 years

16 - 22 Lacs

Bengaluru

Work from Office

We are seeking a talented and experienced Data Scientist to join our team. The ideal candidate will have a strong background in statistics, mathematics, and computer science, along with excellent analytical and problem-solving skills. As a Data Scientist, you will be responsible for collecting, analyzing, and interpreting large datasets to identify actionable insights and drive strategic decision-making across the organization. You will work closely with cross-functional teams to understand business needs, design experiments, develop predictive models, and communicate findings to key stakeholders.

Responsibilities:
- 3+ years of experience in data science and consulting (retail industry preferred)
- Understand the problem statement and implement data science solutions and techniques independently
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions
- Learn and pick up a new language, tool, or platform quickly
- Conceptualize, design, and deliver high-quality solutions and insightful analysis
- Conduct research and prototyping; gather data and requirements
- Collaborate and coordinate with functional teams (Data Engineering and Product Development) to implement models and monitor outcomes
- Deliver AI/ML-based solutions across a host of problems: hypothesis testing, Z-test, T-test, ANOVA, customer segmentation and targeting, propensity modelling, EDA, etc.
- Comfortable working in an Agile development environment
- Develop and deploy solutions that leverage data science and advanced analytics to drive incremental value for a retail business (RFM analysis — see the sketch below — mission segmentation, price optimization, promo optimization, CLTV, etc.)

Hands-on Experience Required:
- Python: Expert
- SQL: Expert
- PySpark: Intermediate
- Databricks: Intermediate
- JIRA or similar tool: Beginner/Intermediate

Good to have:
- Knowledge of any visualization tool
- Understanding of cloud infrastructure and architecture
- Experience in the retail domain

Work Mode: Work from office, 5 days. Location: Whitefield, Bangalore.
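For illustration, a minimal PySpark sketch of the RFM analysis this role names, scoring customers on recency, frequency, and monetary value. The input path, column names, and snapshot date are hypothetical; the unpartitioned windows are fine for a sketch but would need repartitioning at scale.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("rfm-sketch").getOrCreate()
orders = spark.read.parquet("s3://bucket/orders/")  # hypothetical path

snapshot = F.lit("2024-01-01").cast("date")
rfm = (orders.groupBy("customer_id")
       .agg(F.datediff(snapshot, F.max("order_date")).alias("recency"),
            F.countDistinct("order_id").alias("frequency"),
            F.sum("order_value").alias("monetary")))

# Score each dimension into quintiles; lower recency (more recent) is better.
for col in ["recency", "frequency", "monetary"]:
    order = F.col(col).asc() if col == "recency" else F.col(col).desc()
    rfm = rfm.withColumn(col + "_score", F.ntile(5).over(Window.orderBy(order)))

rfm.show(5)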

Posted 4 days ago

Apply

2.0 - 5.0 years

16 - 22 Lacs

Bengaluru

Work from Office

We are seeking a hands-on eCommerce Analytics & Insights Lead to help establish and grow our newly launched eCommerce business. This role requires a data-savvy individual with deep eCommerce knowledge who can help define, build, and monitor performance KPIs, provide actionable insights, and guide the business on data-driven decision making. We are looking for someone who can get into the data, generate insights, help build dashboards, and work closely with cross-functional teams (buying, marketing, ops, tech) to guide the growth of our online channel.

Key Responsibilities:
- Define and set up eCommerce KPIs across the customer journey (traffic, conversion, retention, basket size, etc.)
- Work with marketing, merchandising, operations, and tech teams to establish data tracking and reporting needs
- Help build dashboards and reports to monitor site performance, category performance, and marketing ROI
- Identify trends, opportunities, and root causes of underperformance in areas such as product availability, pricing and promotions, checkout funnel performance, customer retention, and channel acquisition effectiveness
- Help set up cohort analysis (see the sketch below), customer segmentation, and basic funnel analytics
- Partner with the eCommerce and data engineering teams to ensure data quality and availability from digital channels
- Propose data-driven experiments and quick-win ideas to accelerate growth
- Guide and support business teams in understanding and interpreting digital KPIs

Required Skills & Experience:
- 2-5 years of experience in eCommerce analytics, preferably in a grocery retail setup
- Strong knowledge of eCommerce KPIs and analytics frameworks (traffic, conversion, repeat, LTV)
- Working knowledge of tools such as Google Analytics / GA4, Excel, SQL, and Power BI or Tableau
- Experience working with digital marketing data, CRM data, and product performance data
- Ability to translate business questions into data analysis and present clear insights
- Familiarity with customer journey mapping, funnel conversion, basket analysis, etc.
- Comfortable working in fast-paced, ambiguous, build-from-scratch environments
- Strong communication and stakeholder management skills; able to work across functions
- Strong in at least one of SQL / PySpark

Good to Have:
- Experience with eCommerce platforms (e.g., Shopify, Magento, Salesforce Commerce)
- Exposure to A/B testing, product recommendation analytics, or personalization logic
- Knowledge of Python/R for deeper analysis (optional)
- Experience in setting up tracking infrastructure (e.g., GTM, event tagging)

Location: Bangalore, KA (On-site)
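For illustration, a minimal monthly cohort-retention sketch in PySpark of the kind this role describes, assuming an orders table with customer_id and order_date; all names and the path are hypothetical.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cohorts").getOrCreate()
orders = spark.read.parquet("s3://bucket/orders/")  # hypothetical path

# Each customer's cohort is the month of their first order.
first = (orders.groupBy("customer_id")
         .agg(F.date_trunc("month", F.min("order_date")).alias("cohort_month")))

activity = orders.select("customer_id",
                         F.date_trunc("month", "order_date").alias("active_month"))

# Count distinct active customers per cohort per month offset.
cohorts = (activity.join(first, "customer_id")
           .withColumn("months_since",
                       F.months_between("active_month", "cohort_month").cast("int"))
           .groupBy("cohort_month", "months_since")
           .agg(F.countDistinct("customer_id").alias("active_customers"))
           .orderBy("cohort_month", "months_since"))

cohorts.show()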

Posted 4 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune, Gurugram, Bengaluru

Work from Office

What You'll Do:
- Build, refine, and use ML engineering platforms and components
- Scale machine learning algorithms to work on massive data sets under strict SLAs
- Build and orchestrate model pipelines including feature engineering, inferencing, and continuous model training
- Implement MLOps including model KPI measurement, tracking, model drift detection, and model feedback loops
- Collaborate with client-facing teams to understand business context at a high level and contribute to technical requirement gathering
- Implement basic features aligned with technical requirements
- Write production-ready code that is easily testable, understood by other developers, and accounts for edge cases and errors
- Ensure the highest quality of deliverables by following architecture/design guidelines, coding best practices, and periodic design/code reviews
- Write unit tests as well as higher-level tests to handle expected edge cases and errors gracefully, as well as happy paths
- Use bug tracking, code review, version control, and other tools to organize and deliver work
- Participate in scrum calls and agile ceremonies, and effectively communicate work progress, issues, and dependencies
- Consistently contribute to researching and evaluating the latest architecture patterns and technologies through rapid learning, proof-of-concepts, and prototype solutions

What You'll Bring:
- A master's or bachelor's degree in Computer Science or a related field from a top university
- 5+ years of hands-on experience in ML development
- Good fundamentals of machine learning
- Strong programming expertise in Python and PySpark/Scala
- Expertise in crafting ML models for high performance and scalability
- Experience implementing feature engineering, inferencing pipelines, and real-time model predictions
- Experience in MLOps to measure and track model performance; experience working with MLflow (see the sketch below)
- Experience with Spark or other distributed computing frameworks
- Experience with ML platforms like SageMaker and Kubeflow
- Experience with pipeline orchestration tools such as Airflow
- Experience deploying models to cloud services like AWS, Azure, GCP, and Azure ML
- Expertise in SQL and SQL databases
- Knowledge of core CS concepts such as common data structures and algorithms
- Collaborate well with teams of different backgrounds, expertise, and functions

Additional Skills: Understanding of DevOps, CI/CD, and data security; experience designing on a cloud platform; experience in data engineering in Big Data systems
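For illustration, a minimal MLflow tracking sketch of the kind this role involves: logging hyperparameters, a model KPI, and the model artifact for one run. The dataset and parameter values are synthetic placeholders.

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=42).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

    mlflow.log_params(params)            # hyperparameters for this run
    mlflow.log_metric("test_auc", auc)   # KPI to compare runs and watch for drift
    mlflow.sklearn.log_model(model, "model")  # model artifact for later serving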

Posted 4 days ago

Apply

3.0 - 5.0 years

6 - 8 Lacs

Gurugram

Work from Office

Develop and execute high-impact analytics solutions for large, complex, structured, and unstructured data sets (including big data) to drive impact on client business (topline). This person will lead the engagement for AI-based SaaS product deployment to clients across industries, leveraging strong data science, analytics, and engineering skills to build advanced analytics processes, build scalable and operational process pipelines, and find data-driven insights that help our clients solve their most important business problems and bring optimizations. Associate Consultants also engage with the Project Leadership team and clients to help them understand the insights, summaries, and implications and make plans to act on them.

What You'll Do:
Deep analytics-tech expertise:
- Develop and implement advanced algorithms that solve complex business problems in a computationally efficient and statistically effective manner, leveraging tools like PySpark, Python, and SQL on a client/ZS cloud environment
- Execute statistical and data modelling techniques (e.g. hypothesis testing, A/B testing setup — see the sketch below — marketing impact analytics, statistical validity) on large data sets to identify trends, figures, and other relevant information, with scalable and operational process implementations
- Evaluate emerging datasets and technologies that may contribute to our analytical platform, including a good understanding of Generative AI capabilities and SaaS products

Communication, collaboration, unstructured problem solving, and client engagement (in a high-performing, high-intensity team environment):
- Problem solving and client engagement: understand client business priorities, develop product use cases, do proforma analysis to estimate business opportunity, and deploy the use case for clients
- Collaboration: work in a cross-functional team environment to lead the client engagement and collaborate on holistic solutions comprising best practices from frontend and backend engineering, data science, and ML engineering
- Storyboarding and impact communication: build effective storyboards to communicate solution impact to clients and ZS leadership
- Scaling mindset: provide structure to client engagement, build and maintain standardized and operationalized quality checks on the team's work, and ensure high-quality client deliverables
- Team management: export best practices and learnings to the broader team and mentor Associates on teams

What You'll Bring:
- Bachelor's degree in Computer Science (or Statistics) from a premier institute, with strong academic performance and analytics/quantitative coursework
- Knowledge of programming: Python (deep expertise), PySpark, SQL
- Expertise in machine learning, regression, clustering, and classification models (preferably in a product environment)
- Knowledge of big data and advanced analytics concepts and algorithms (e.g. social listening, recommender systems, predictive modeling)
- Excellent oral and written communication skills
- Strong attention to detail, with a value-addition mindset
- Excellent critical thinking and problem-solving skills
- High motivation, good work ethic, and maturity
- 3-5 years of relevant post-collegiate work experience, preferably in industries like B2C and product companies, in execution roles focused on data and decision sciences, data engineering, stakeholder management, and building scalable processes
- Hands-on analytics experience where the candidate has worked on the algorithms and methodology from scratch, not merely executed existing code and processes
- Ability to coach and mentor juniors on the team to drive on-the-job learning and expertise building
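For illustration, a minimal A/B test evaluation sketch (a two-proportion z-test with statsmodels), the kind of hypothesis-testing setup the listing mentions. The conversion counts are invented.

from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]     # successes in control and variant
visitors = [10000, 10000]    # sample sizes for each arm

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")
# Reject the null of equal conversion rates when p falls below the chosen alpha.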

Posted 4 days ago

Apply

0 years

2 - 2 Lacs

Hyderābād

On-site

Role Summary & Role Description: Technical Manager with strong Oracle and PL/SQL expertise to design, develop, and optimize data workflows on the Databricks platform. The ideal candidate will have deep expertise in Apache Spark, PySpark, Python, job orchestration, and CI/CD integration to support scalable data engineering and analytics solutions. Analyzes, designs, develops, and maintains software applications to support business units. Expected to spend 80% of the time on hands-on development, design, and architecture, and the remaining 20% on guiding the team on technology and removing other impediments. Capital Markets project experience preferred.

- Provides advanced technical expertise in analyzing, designing, estimating, and developing software applications to the project schedule
- Oversees systems design and implementation of the most complex design components
- Creates project plans and deliverables and monitors task deadlines
- Oversees, maintains, and supports existing software applications
- Provides subject matter expertise in reviewing, analyzing, and resolving complex issues
- Designs and executes end-to-end system tests of new installations and/or software prior to release to minimize failures and impact to business and end users
- Responsible for resolution, communication, and escalation of critical technical issues
- Prepares user and systems documentation as needed; identifies and recommends industry best practices
- Serves as a mentor to junior staff and acts as a technical lead/mentor for developers in day-to-day and overall project areas
- Able to lead a team of agile developers; has worked on complex, deadline-driven projects with minimal supervision
- Able to architect, design, and develop from minimal requirements by effectively coordinating activities between business analysts, scrum leads, developers, and managers
- Able to provide agile status notes on day-to-day project tasks

Technical Skills:
- Design and implement robust ETL pipelines using Databricks notebooks and workflows (see the sketch below)
- Proficiency in Python, Scala, Apache Spark, SQL, and Spark DataFrames
- Experience with job orchestration tools and scheduling frameworks
- Optimize Spark jobs for performance and cost-efficiency
- Develop and manage job orchestration strategies using Databricks Jobs and Workflows
- Familiarity with CI/CD practices and tools
- Monitor and troubleshoot production jobs, ensuring reliability and data quality
- Implement security and governance best practices, including access control and encryption
- Strong practical experience using Scrum, agile modelling, and adaptive software development
- Ability to understand and grasp the big picture of system components
- Experience building environment and architecture/design guides and application blueprints
- Strong understanding of data modeling, warehousing, and performance tuning
- Excellent problem-solving and communication skills

Core/must-have skills: Oracle, SQL, PL/SQL, Python, Scala, Apache Spark, Spark Streaming, CI/CD pipelines, AWS cloud experience. Good-to-have skills: Airflow. Work Schedule: 12 PM to 9 PM IST.

About State Street: What we do. State Street is one of the largest custodian banks, asset managers and asset intelligence companies in the world. From technology to product innovation, we're making our mark on the financial services industry. For more than two centuries, we've been helping our clients safeguard and steward the investments of millions of people. We provide investment servicing, data & analytics, investment research & trading and investment management to institutional clients.

Work, Live and Grow. We make all efforts to create a great work environment. Our benefits packages are competitive and comprehensive. Details vary by location, but you may expect generous medical care, insurance and savings plans, among other perks. You'll have access to flexible Work Programs to help you match your needs. And our wealth of development programs and educational support will help you reach your full potential.

Inclusion, Diversity and Social Responsibility. We truly believe our employees' diverse backgrounds, experiences and perspectives are a powerful contributor to creating an inclusive environment where everyone can thrive and reach their maximum potential while adding value to both our organization and our clients. We warmly welcome candidates of diverse origin, background, ability, age, sexual orientation, gender identity and personality. Another fundamental value at State Street is active engagement with our communities around the world, both as a partner and a leader. You will have tools to help balance your professional and personal life, paid volunteer days, matching gift programs and access to employee networks that help you stay connected to what matters to you. State Street is an equal opportunity and affirmative action employer. Discover more at StateStreet.com/careers
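For illustration, a minimal Databricks-style ETL sketch matching the pipeline work described above: read raw data, clean it with PySpark, and write a partitioned Delta table. Paths and columns are hypothetical, and Delta Lake support is assumed on the cluster.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.json("/mnt/raw/trades/")  # hypothetical landing zone

cleaned = (raw.dropDuplicates(["trade_id"])           # basic data-quality step
              .withColumn("trade_date", F.to_date("trade_ts"))
              .filter(F.col("quantity") > 0))

(cleaned.write.format("delta")                        # ACID-compliant Delta output
        .mode("overwrite")
        .partitionBy("trade_date")
        .save("/mnt/delta/bronze/trades"))            # hypothetical bronze path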

Posted 4 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What you'll do: We are looking for experienced Knowledge Graph developers with the following technical skillsets and experience.
- Take complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements
- Apply appropriate development methodologies (e.g. agile, waterfall) and best practices (e.g. mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments
- Collaborate with other team members to leverage expertise and ensure seamless transitions; exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management
- Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management
- Bring transparency in driving assigned tasks to completion and report accurate status
- Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams
- Assist senior team members and delivery leads in project management responsibilities
- Build complex solutions using programming languages, ETL service platforms, etc.

What you'll bring:
- Bachelor's or master's degree in computer science, engineering, or a related field
- 4+ years of professional experience in Knowledge Graph development in Neo4j, AWS Neptune, or the Anzo knowledge graph database (see the sketch below)
- 3+ years of experience in RDF ontologies, data modelling, and ontology development
- Strong expertise in Python, PySpark, and SQL
- Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data
- Project management and task planning experience, ensuring smooth execution of deliverables and timelines
- Strong communication and interpersonal skills to collaborate with both technical and non-technical teams
- Experience with automation testing
- Performance optimization: knowledge of techniques to optimize knowledge graph operations such as data inserts
- Data modeling: proficiency in designing effective data models within a knowledge graph, including relationships between entities and optimizing data for reporting
- Motivation and willingness to learn new tools and technologies as the team requires

Additional Skills:
- Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations
- Experience in pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus; experience in manufacturing data is a plus
- Capability to simplify complex concepts into easily understandable frameworks and presentations
- Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects
- Travel to other offices as required to collaborate with clients and internal project teams

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At:
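For illustration, a minimal sketch of the Neo4j work this listing names, using the official Neo4j Python driver to MERGE two nodes and a relationship. The URI, credentials, labels, and properties are hypothetical, and session.execute_write assumes a 5.x driver.

from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))  # hypothetical credentials

def link_drug_to_condition(tx, drug, condition):
    # MERGE is idempotent: nodes and the edge are created only if missing.
    tx.run("MERGE (d:Drug {name: $drug}) "
           "MERGE (c:Condition {name: $condition}) "
           "MERGE (d)-[:TREATS]->(c)",
           drug=drug, condition=condition)

with driver.session() as session:
    session.execute_write(link_drug_to_condition, "aspirin", "headache")

driver.close()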

Posted 4 days ago

Apply

5.0 years

15 - 18 Lacs

Hyderābād

On-site

We are hiring a Senior Data Scientist for our client. Role: Sr. Data Scientist (Predictive Analytics Focus & Databricks). Experience: 5 years. Location: Hyderabad.

Responsibilities:
- Design and deploy predictive models (e.g., forecasting, churn analysis, fraud detection) using Python/SQL, Spark MLlib, and Databricks ML
- Build end-to-end ML pipelines (data ingestion → feature engineering → model training → deployment) on the Databricks Lakehouse (see the sketch below)
- Optimize model performance via hyperparameter tuning, AutoML, and MLflow tracking
- Collaborate with engineering teams to operationalize models (batch/real-time) using Databricks Jobs or REST APIs
- Implement Delta Lake for scalable, ACID-compliant data workflows
- Enable CI/CD for ML pipelines using Databricks Repos and GitHub Actions
- Troubleshoot issues in Spark jobs and the Databricks environment

Requirements:
- 3 to 5 years of experience in predictive analytics, with expertise in regression, classification, and time-series modeling
- Hands-on experience with Databricks Runtime for ML, Spark SQL, and PySpark
- Familiarity with MLflow, Feature Store, and Unity Catalog for governance
- Industry experience in Life Insurance or P&C
- Good to have: Databricks Certified ML Practitioner certification

Technical Skills: Python, PySpark, MLflow, Databricks AutoML. Predictive modelling (classification, clustering, regression, time series, and NLP). Cloud platform (Azure/AWS), Delta Lake, Unity Catalog.

Interested candidates, kindly share your updated profile with pavani@sandvcapitals.com or reach us on 7995292089. Thank you. Job Type: Full-time. Pay: ₹1,500,000.00 - ₹1,800,000.00 per year. Experience: Data Scientist: 4 years (Required). Work Location: In person
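For illustration, a minimal churn-classification pipeline sketch with Spark MLlib, mirroring the ingestion → features → training flow described above. The table and column names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import StringIndexer, VectorAssembler
from pyspark.ml.classification import GBTClassifier
from pyspark.ml.evaluation import BinaryClassificationEvaluator

spark = SparkSession.builder.appName("churn-sketch").getOrCreate()
df = spark.table("lakehouse.policies")  # hypothetical Delta table

indexer = StringIndexer(inputCol="plan_type", outputCol="plan_idx")
assembler = VectorAssembler(
    inputCols=["tenure_months", "premium", "claims_count", "plan_idx"],
    outputCol="features")
gbt = GBTClassifier(labelCol="churned", featuresCol="features")

train, test = df.randomSplit([0.8, 0.2], seed=7)
model = Pipeline(stages=[indexer, assembler, gbt]).fit(train)

# Evaluate area under ROC on the held-out split.
auc = BinaryClassificationEvaluator(labelCol="churned").evaluate(model.transform(test))
print(f"test AUC = {auc:.3f}")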

Posted 4 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Gurugram

Work from Office

Associate Consultant, Customer Success, BHCA: Customer Success Associate Consultants design, develop, and execute high-impact analytics solutions for large, complex, structured, and unstructured data sets (including big data) to drive impact on client business (topline). This person will lead the engagement for AI-based SaaS product deployment to clients across industries, leveraging strong data science, analytics, and engineering skills to build advanced analytics processes, build scalable and operational process pipelines, and find data-driven insights that help our clients solve their most important business problems and bring optimizations. Associate Consultants also engage with the Project Leadership team and clients to help them understand the insights, summaries, and implications and make plans to act on them.

What You'll Do:
Deep analytics-tech expertise:
- Develop and implement advanced algorithms that solve complex business problems in a computationally efficient and statistically effective manner, leveraging tools like PySpark, Python, and SQL on a client/ZS cloud environment
- Execute statistical and data modelling techniques (e.g. hypothesis testing, A/B testing setup, marketing impact analytics, statistical validity) on large data sets to identify trends, figures, and other relevant information, with scalable and operational process implementations
- Evaluate emerging datasets and technologies that may contribute to our analytical platform, including a good understanding of Generative AI capabilities and SaaS products

Communication, collaboration, unstructured problem solving, and client engagement (in a high-performing, high-intensity team environment):
- Problem solving and client engagement: understand client business priorities, develop product use cases, do proforma analysis to estimate business opportunity, and deploy the use case for clients
- Collaboration: work in a cross-functional team environment to lead the client engagement and collaborate on holistic solutions comprising best practices from frontend and backend engineering, data science, and ML engineering
- Storyboarding and impact communication: build effective storyboards to communicate solution impact to clients and ZS leadership
- Scaling mindset: provide structure to client engagement, build and maintain standardized and operationalized quality checks on the team's work, and ensure high-quality client deliverables
- Team management: export best practices and learnings to the broader team and mentor Associates on teams

What You'll Bring:
- Bachelor's degree in Computer Science (or Statistics) from a premier institute, with strong academic performance and analytics/quantitative coursework
- Knowledge of programming: Python (deep expertise), PySpark, SQL
- Expertise in machine learning, regression, clustering, and classification models, preferably in a product environment (see the clustering sketch below)
- Knowledge of big data and advanced analytics concepts and algorithms (e.g. social listening, recommender systems, predictive modeling)
- Excellent oral and written communication skills
- Strong attention to detail, with a value-addition mindset
- Excellent critical thinking and problem-solving skills
- High motivation, good work ethic, and maturity
- 3-5 years of relevant post-collegiate work experience, preferably in industries like B2C and product companies, in execution roles focused on data and decision sciences, data engineering, stakeholder management, and building scalable processes
- Hands-on analytics experience where the candidate has worked on the algorithms and methodology from scratch, not merely executed existing code and processes
- Ability to coach and mentor juniors on the team to drive on-the-job learning and expertise building
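For illustration, a minimal customer-segmentation sketch with KMeans in Spark MLlib, the kind of clustering model the listing asks for. The table and feature columns are hypothetical.

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler, StandardScaler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("segments-sketch").getOrCreate()
df = spark.table("analytics.customer_features")  # hypothetical feature table

assembler = VectorAssembler(
    inputCols=["recency", "frequency", "monetary"], outputCol="raw")
assembled = assembler.transform(df)

# Standardize features so no single dimension dominates the distance metric.
scaler = StandardScaler(inputCol="raw", outputCol="features")
feats = scaler.fit(assembled).transform(assembled)

model = KMeans(k=5, seed=1, featuresCol="features").fit(feats)
model.transform(feats).groupBy("prediction").count().show()  # segment sizes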

Posted 4 days ago

Apply

3.0 - 5.0 years

8 - 13 Lacs

Noida, Gurugram, Bengaluru

Work from Office

The TCA practice has experienced significant growth in demand for engineering & architecture roles from CST, driven by client needs that extend beyond traditional data & analytics architecture skills. There is an increasing emphasis on deep technical skills such as strong expertise in Azure, Snowflake, Azure OpenAI, and Snowflake Cortex, along with a solid understanding of their respective functionalities. The individual will work on a robust pipeline of TCA-driven projects with pharma clients. This role offers significant opportunities for progression within the practice.

What You'll Do:
- Work on high-impact projects with leading clients, with exposure to complex technological initiatives
- Learning support through organization-sponsored trainings and certifications
- Collaborative and growth-oriented team culture, with a clear progression path within the practice
- Opportunity to work on the latest technologies
- Successful delivery of client projects, a continuous learning mindset, and certifications in newer areas
- Partner with project leads and AEEC leads to deliver complex projects and grow the TCA practice
- Develop expert tech solutions for client needs, with positive feedback from clients and team members

What You'll Bring:
- 3-5 years of experience bringing technology acumen with an architectural bent to advance practice assets and client value delivery
- Experience in pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus
- Strong expertise in AWS, Python, PySpark, SQL, Azure (Azure Functions / Azure Data Factory / Azure API services / Azure DevOps), and shell scripting, with a deep understanding of engineering principles, the ability to solve complex problems, and the ability to guide the team in technical decision-making while staying updated with industry advancements
- Desirable: reasonable domain experience or subject matter expertise that goes beyond typical technology skills, preferably pharma and life sciences
- Able to work in an offshore-onshore setting, collaborating with multiple teams including client and ZS stakeholders across multiple time zones; exposure to managing a small team of junior developers, QC-ing their deliverables, and coaching as needed
- Excellent communication and client engagement skills
- Desirable: industry-recognized certifications from accredited vendors like AWS, GCP, Databricks, Azure, and Snowflake

Additional Skills: Bachelor's or master's degree in computer science, Information Technology, or a related field; 3-4 years of relevant industry experience

Posted 4 days ago

Apply

3.0 - 5.0 years

7 - 11 Lacs

Hyderabad, Chennai

Work from Office

Incedo is hiring a Data Engineer - GCP: immediate to 30-day joiners preferred! Are you passionate about GCP data engineering and looking for an exciting opportunity to work on cutting-edge projects? We're looking for a GCP Data Engineer to join our team in Chennai or Hyderabad!

Skills Required:
- Experience: 3 to 5 years
- Experience with GCP, Python, Airflow, and PySpark (see the sketch below)

Location: Chennai/Hyderabad (WFO). If you are interested, please drop your resume at anshika.arora@incedoinc.com. Walk-in drive in Hyderabad on 2nd Aug; kindly email for an invite and more details.
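For illustration, a minimal Airflow DAG sketch of the Airflow + PySpark orchestration this role combines, submitting a PySpark job daily. The DAG id, script path, and connection id are hypothetical, and the import assumes the apache-airflow-providers-apache-spark package is installed.

from datetime import datetime
from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_ingest",                 # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = SparkSubmitOperator(
        task_id="ingest_events",
        application="/opt/jobs/ingest_events.py",  # hypothetical PySpark script
        conn_id="spark_default",                   # Spark connection in Airflow
    )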

Posted 4 days ago

Apply

5.0 years

3 - 5 Lacs

Gurgaon

On-site

With 5 years of experience in Python, PySpark, and SQL (see the PySpark sketch after this listing), you will have the necessary skills to handle a variety of tasks. You will also have hands-on experience with AWS services, including Glue, EMR, Lambda, S3, EC2, and Redshift. You will work out of the Virtusa office, collaborating with a team of experts. Your core skills should include Scala, Kafka, PySpark, and AWS native data services, as these are mandatory for the role. Knowledge of Big Data is a nice-to-have skill that will set you apart from other candidates.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
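For illustration, a minimal sketch of the PySpark-on-AWS pattern this role calls for: read from S3, aggregate, and write curated Parquet back to S3. The bucket names are hypothetical; on EMR or Glue the S3 connector is preconfigured.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-agg-sketch").getOrCreate()

events = spark.read.parquet("s3://example-raw/events/")  # hypothetical bucket

# Daily event counts for a downstream Redshift load or BI layer.
daily = (events.groupBy(F.to_date("event_ts").alias("event_date"))
         .agg(F.count("*").alias("event_count")))

daily.write.mode("overwrite").parquet("s3://example-curated/daily_counts/")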

Posted 4 days ago

Apply

0 years

6 - 9 Lacs

Gurgaon

On-site

Responsibilities: This function covers incumbents responsible for various data activities, including data analysis, maintenance, data quality, and continuous interaction with business users to understand requirements and convert them into the needed code. Understanding of marketing data and the Retail line of business is a plus. Day-to-day work focuses on creating SAS code to audit campaign data, execute campaigns, identify deviations, and analyze their correctness. BAU also includes creating reports for business users in the Retail line of business using SAS and Excel, with a planned migration to Tableau or an equivalent approved reporting tool. Knowledge of Autosys and ServiceNow is an add-on. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Technology Stack:
- Previous experience with SAS (intermediate to expert) for creating reports and complex data sets
- Excel; Tableau or an equivalent reporting tool
- Beginner/intermediate knowledge of Python/PySpark and Hadoop/Hive
- High attention to detail and analytical skills
- Logical approach to problem solving and good written and verbal communication skills

Job Family Group: Decision Management. Job Family: Data/Information Management. Time Type: Full time. Most Relevant Skills: Please see the requirements listed above. Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 4 days ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Join us as a Data Engineer, PySpark, AWS. We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling, and ETL design while aspiring to be commercially successful through insights. If you're ready for a new challenge and want to bring a competitive edge to your career profile by delivering streaming data ingestions (see the sketch below), this could be the role for you. We're offering this role at associate vice president level.

What you'll do: Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for:
- Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions
- Participating in the data engineering community to deliver opportunities to support our strategic direction
- Carrying out complex data engineering tasks to build a scalable data architecture and transform data to make it usable to analysts and data scientists
- Building advanced automation of data engineering pipelines through the removal of manual stages
- Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required

The skills you'll need: To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience of extracting value and features from large-scale data. You'll need at least eight years of experience working with Python, PySpark, and SQL, experience in AWS architecture using EMR, EC2, S3, Lambda, and Glue, and experience in Apache Airflow, Anaconda, and SageMaker.

You'll also need:
- Experience of using programming languages alongside knowledge of data and software engineering fundamentals
- Experience with performance optimization and tuning
- Good knowledge of modern code development practices
- Great communication skills with the ability to proactively engage with a range of stakeholders
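For illustration, a minimal streaming-ingestion sketch with Spark Structured Streaming, since the role highlights streaming data ingestions. The broker, topic, schema, and paths are hypothetical, and the Kafka connector package is assumed to be on the cluster.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("stream-ingest-sketch").getOrCreate()

schema = StructType([StructField("account_id", StringType()),
                     StructField("amount", DoubleType())])

# Read JSON payments from Kafka and parse the JSON payload.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "payments")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Land micro-batches in the lake; the checkpoint makes the query restartable.
query = (stream.writeStream.format("parquet")
         .option("path", "s3://example-lake/payments/")
         .option("checkpointLocation", "s3://example-lake/_chk/payments/")
         .trigger(processingTime="1 minute")
         .start())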

Posted 4 days ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Gurugram

Work from Office

Join us as a Data Engineer, PySpark, AWS. We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling, and ETL design while aspiring to be commercially successful through insights. If you're ready for a new challenge and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at associate vice president level.

What you'll do: Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for:
- Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions
- Participating in the data engineering community to deliver opportunities to support our strategic direction
- Carrying out complex data engineering tasks to build a scalable data architecture and transform data to make it usable to analysts and data scientists
- Building advanced automation of data engineering pipelines through the removal of manual stages
- Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required

The skills you'll need: To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience of extracting value and features from large-scale data. You'll need at least eight years of experience working with Python, PySpark, and SQL, experience in AWS architecture using EMR, EC2, S3, Lambda, and Glue, and experience in Apache Airflow, Anaconda, and SageMaker.

You'll also need:
- Experience of using programming languages alongside knowledge of data and software engineering fundamentals
- Experience with performance optimization and tuning
- Good knowledge of modern code development practices
- Great communication skills with the ability to proactively engage with a range of stakeholders

Posted 4 days ago

Apply

11.0 - 16.0 years

13 - 18 Lacs

Gurugram

Work from Office

Join our digital revolution in NatWest Digital X. In everything we do, we work to one aim: to make digital experiences which are effortless and secure. So we organise ourselves around three principles: engineer, protect, and operate. We engineer simple solutions, we protect our customers, and we operate smarter. Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive. This role is based in India and as such all normal working days must be carried out in India.

Job Description: Join us as a Principal Engineer, Python and PySpark. This is an exciting and challenging opportunity to work in a collaborative, agile and forward-thinking team environment. With your software development background, you'll be delivering software components to enable the delivery of platforms, applications and services for the bank. As well as developing your technical talents, you'll have the opportunity to build project and leadership skills which will open up a range of exciting career options. We're offering this role at vice president level.

What you'll do: As a Principal Engineer, you'll be driving the development of software and tools to accomplish project and departmental objectives by converting functional and non-functional requirements into suitable designs. You'll play a leading role in planning, developing and deploying high-performance, robust and resilient systems for the bank, and will develop your leadership skills as you manage the technical delivery of one or more software engineering teams. You'll also gain a distinguished leadership status in the software engineering community as you lead wider participation in internal and industry-wide events, conferences and other activities.

You'll also be:
- Designing and developing high-performance and high-availability applications, using proven frameworks and technologies
- Making sure that the bank's systems follow excellent architectural and engineering principles, and are fit for purpose
- Monitoring technical progress against plans while safeguarding functionality, scalability and performance, and providing progress updates to stakeholders
- Designing and developing reusable libraries and APIs for use across the bank
- Writing unit and integration tests within automated test environments to ensure code quality (see the sketch below)

The skills you'll need: You'll come with a background in software engineering, software or database design and architecture, as well as significant experience developing software within an SOA or microservices paradigm. You'll need at least twelve years of experience working with Python, PySpark and AWS.

You'll also need:
- Experience of leading software development teams, introducing and executing technical strategies
- Knowledge of industry-recognised frameworks and development tooling
- Experience of test-driven development and using automated test frameworks, mocking and stubbing, and unit testing tools
- A background in designing or implementing APIs
- Experience of supporting, modifying and maintaining systems and code developed by teams other than your own
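For illustration, a minimal sketch of unit-testing a PySpark transformation with pytest, reflecting the automated-testing emphasis above. The transformation, fixture, and column names are hypothetical.

import pytest
from pyspark.sql import SparkSession, functions as F

def add_total(df):
    """Transformation under test: total = price * quantity."""
    return df.withColumn("total", F.col("price") * F.col("quantity"))

@pytest.fixture(scope="session")
def spark():
    # A small local session shared across the test session.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

def test_add_total(spark):
    df = spark.createDataFrame([(2.0, 3), (1.5, 4)], ["price", "quantity"])
    out = [r.total for r in add_total(df).select("total").collect()]
    assert out == [6.0, 6.0]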

Posted 4 days ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Senior Infrastructure Architect. Would being part of a digital transformation excite you? Are you passionate about infrastructure security? Join our digital transformation team. We operate at the heart of the digital transformation of our business. Our team is responsible for the cybersecurity, architecture and data protection of our global organization. We advise on the design and validation of all systems, infrastructure, technologies and data protection.

Partner with the best. As a Senior Infrastructure Architect, you will be responsible for:
- Participating in domain technical and business discussions about future architecture direction
- Assisting in the analysis, design and development of a roadmap and implementation based on a current-versus-future-state comparison, from a cohesive architecture viewpoint
- Gathering and analyzing data and developing architectural requirements at the project level
- Participating in the infrastructure architecture governance model
- Supporting the design and deployment of infrastructure solutions meeting standardization, consolidation, TCO, security, regulatory compliance and application system quality goals for different businesses
- Researching and evaluating emerging technology, industry and market trends to assist in project development and/or operational support activities
- Coaching and mentoring team members

Fuel your passion. To be successful in this role you will:
- Have a Bachelor's degree and a minimum of 8 years of professional experience
- Have experience with Azure infrastructure services and automating deployments
- Have experience working with DevOps and Databricks
- Have hands-on experience with database technologies, including ETL tools such as Databricks Workflows using PySpark/Python, and an ability to learn new technologies
- Have strong proficiency in writing and optimizing SQL queries and working with databases
- Have skilled-level expertise in designing compute, network or storage to meet business application system qualities
- Understand technical and business discussions about future architecture direction aligned with business goals
- Understand concepts of setting and driving architecture direction
- Be familiar with elements of gathering architecture requirements
- Understand architecture standards concepts and how to apply them to project work

Work in a way that works for you. We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns: working remotely from home or any other work location, and flexibility in your work schedule to help fit in around life! Talk to us about your desired flexible working options when you apply.

Working with us: Our people are at the heart of what we do at Baker Hughes. We know we are better when all of our people are developed, engaged and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent and develop leaders at all levels to bring out the best in each other.

About Us: We are an energy technology company that provides solutions to energy and industrial customers worldwide. Built on a century of experience and conducting business in over 120 countries, our innovative technologies and services are taking energy forward, making it safer, cleaner and more efficient for people and the planet.

Join Us: Are you seeking an opportunity to make a real difference in a company that values innovation and progress? Join us and become part of a team of people who will challenge and inspire you! Let's come together and take energy forward. Baker Hughes Company is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. R139742

Posted 4 days ago

Apply

18.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Organization: At CommBank, we never lose sight of the role we play in other people's financial wellbeing. Our focus is to help people and businesses move forward to progress. To make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things.

Job Title: Chief Engineer. Location: Bangalore.

Business & Team: Business Banking Technology (BBT) has a vision of becoming the leading business bank, powered by the next horizon of technology. We're delivering on this by working hand-in-hand with our business colleagues to jointly solve problems with customer centricity and technical innovation, cultivating a world-class team of empowered people, and building technology solutions for the future. We put the customer at the center of everything we do and measure our performance against the group's external customer satisfaction measures. The Business Banking Technology team within the Technology function manages the end-to-end technology needs for Business Banking (BB) within the CBA Group. Our team is composed of engineers and technology leaders who bring the right mix of skills to enable this transformation. We also work very closely with our business and operations colleagues to support these services, which are critical to the Australian and global economy. Working as part of the Payments Senior Leadership team, you will be accountable for prioritizing, coordinating, and leading the execution of technical platform delivery and service management across the BB domain.

Impact & Contribution: This position will work with a team of engineers across Payments Technology and partner with stakeholders to build, design and deliver solutions. As the Chief Engineer, you will lead a large multi-disciplinary function responsible for defining, designing and building the highest-quality reusable framework for Payments Technology. You will define the capability uplift approach for the Chapter Area and drive uplift, in partnership with the Practice and peers (where applicable) and in line with industry. As important as ensuring quality outcomes, your role is focused on continually refining implementation standards, accelerating delivery through process improvements and building a world-class team culture.

Reporting Lines: Direct line reporting into the Executive Manager, Payments Technology. Functional (dotted line) reporting into the General Manager, Payments Technology.

Roles & Responsibilities:
- The Chief Engineer plays a pivotal role in delivering on Payments Technology's purpose through quality execution and delivery
- Leverage metrics and data to inform continuous improvement opportunities that increase the effectiveness of the Chapter
- Implement, evolve and support consistent ways of working across the Chapter and CBA business that align to industry standards
- Uplift the Chapter's maturity by fostering a culture of sharing, learning and problem solving across Chapter Areas
- Facilitate the constructive resolution of conflicts which may arise both internally and externally to the Chapter
- Participate in and contribute to the Practice and/or Chapter, including maintaining technical experience commensurate with the specific Chapter
- Provide oversight and leadership of data projects with strategic business and customer value for Business Banking
- Optimize the resource model for efficiency and scale with resources located onshore and offshore
- Build world-class team culture and engagement with a strong focus on career development
- Adhere to the Code of Conduct, which sets the standards of behaviour, actions and decisions we expect from our people

Essential Skills:
- An experienced senior leader who has demonstrated success achieving measurable performance improvements across large and complex businesses
- Well experienced with the "everything as code" development approach and related tooling such as Ansible, Terraform and Python
- 18+ years' experience, preferably gained in banking and finance
- Hands-on experience working with the following technologies: Java, TypeScript, PySpark or Python; AWS cloud and data platforms; AI tools in engineering (e.g. GitHub Copilot, Route Load)
- Proven ability to design, implement, and manage CI/CD/CT pipelines using GitHub Actions
- Expertise with microservices, REST API integration, and detailed solution design
- Sound knowledge of and experience working with AWS services (such as EKS, Helm charts, and Lambda, among others)
- Ability to drive and influence senior-level stakeholder engagement across business and technology
- Leadership experience across multi-disciplinary teams
- Excellent communication skills, especially verbal communication, with proven presentation skills to large audiences
- Experienced technologist who understands data platforms and their capabilities

Education Qualifications: Bachelor's or Master's degree in engineering in Computer Science / Information Technology.

If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We're keen to support you with the next step in your career. We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696. Advertising End Date: 09/08/2025

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Andhra Pradesh

Remote

Software Engineer Lead Analyst - HIH - Evernorth

ABOUT EVERNORTH: Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview: Are you looking for an opportunity to engineer modern Cloud applications, Continuous Deployment/Continuous Integration, and infrastructure solutions across a breadth of core platforms and technologies, and to foster a community while being supported with plentiful training and mentorship opportunities? We're looking for a Senior Software Engineering Manager who will be responsible for enriching a Cloud Practice within Core Tech Engg & Solutions (CTES) to accelerate our Cloud journey for claim platforms by reengineering current-state legacy systems. The journey requires building foundational capabilities and patterns and enabling teams in design, technical architecture, and selecting cloud services wisely. This role requires proven technology leadership with strong hands-on digital experience.

Responsibilities:
- As a lead engineer, evangelize a software engineering culture and community within core platforms
- Enable Cloud engineering teams and the guild/practice across core platforms
- Develop existing engineering talent for modern Cloud and DevSecOps practices and solutions
- Engineer foundational modern Cloud capabilities and empower and enable teams to utilize them
- Develop a CTES Cloud roadmap and track progress against it
- Coach CTES scrum teams, empowering them on their Cloud and DevSecOps journey
- Analyze and assess the current state of core claim adjudication platforms and their bottlenecks
- Lead the architecture and design of claim modernization
- Lead claim platform data management, migration planning, and engineering/operations
- Lead claim platform (real-time and batch) performance, volume, and scalability from an engineering, testing, and maintenance perspective
- Provide innovative solutions to move from complex system architecture to modern component-based cloud solutions

Qualifications - Required Skills:
- Strong written and verbal communication skills with the ability to interact with all levels of the organization
- Strong influencing/negotiation, interpersonal/relationship management, and time and project management skills
- Strong Angular, Node.js, MongoDB, and Postgres DB skills required
- Familiarity with agile methodology, including SCRUM team leadership
- Familiarity with modern delivery practices such as continuous integration, behavior/test-driven development, and specification by example
- Strong experience with complex AWS implementations
- Positive, growth-oriented mindset; hands-on, collaborative, partnering, engineering-lead mindset
- Robust technical acumen in cloud and on-prem DevSecOps software development and technologies
- Passion for building modular, configurable Cloud and DevSecOps solutions utilizing the latest technologies
- Empathetic technical leadership and talent development
- Boundless intellectual curiosity to continually explore how to move better, faster and cheaper
- Collaborative voice for solution integrity utilizing best practices and industry standards
- Technical and marketing skills to craft and execute initiatives
- Robust product management skills and inspiration for DevSecOps/software engineering teams

Required Experience & Education:
- Proven experience with the architecture, design, and development of large-scale enterprise application solutions
- College degree (Bachelor) in related technical/business areas or equivalent work experience
- Strong healthcare domain experience, especially within claim adjudication
- Expert-level experience in relational and NoSQL databases
- 5-8 years of experience in data and digital engineering
- Hands-on experience engineering and developing modern on-prem/public cloud DevSecOps solutions in Angular, Node.js, Python and Go
- Willingness to learn and a go-above-and-beyond attitude
- Experience with modern and legacy development languages

Desired Experience:
- Exposure to AWS, PySpark, and Postgres DB / DynamoDB
- Healthcare experience including Disease Management

Location & Hours of Work

Equal Opportunity Statement: Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 4 days ago

Apply

5.0 - 6.0 years

0 Lacs

Andhra Pradesh

On-site

Title: Developer (AWS Engineer)

Requirements:
Candidate must have 5-6 years of IT working experience; at least 3 years of experience in an AWS Cloud environment is preferred.
Strong hands-on experience; proficient in Node.js and Python.
Seasoned developer capable of independently driving development tasks.
Ability to understand the existing system architecture and work towards the target architecture.
Experience with data profiling activities, discovering data quality challenges, and documenting them.
Good to have: experience with development and implementation of large-scale Data Lake and data analytics platforms on the AWS Cloud platform.
Develop and unit test data pipeline architecture for data ingestion processes using AWS native services.
Experience with development on AWS Cloud using services such as Redshift, RDS, S3, Glue ETL, Glue Data Catalog, EMR, PySpark, Python, Lake Formation, Airflow, SQL scripts, etc. (a minimal sketch of this kind of Glue job follows this posting).
Good to have: experience building a data analytics platform using Databricks (data pipelines) and Starburst (semantic layer) on an AWS cloud environment.
Experience with orchestration of workflows in an enterprise environment.
Experience working with source code management tools such as AWS CodeCommit or GitHub.
Experience working with Jenkins or any CI/CD pipelines using AWS services.
Working experience with Agile methodology.
Experience working in an onshore/offshore model and collaborating on deliverables.
Good communication skills to interact with the onshore team.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
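For illustration, here is a minimal sketch of the kind of AWS Glue PySpark ingestion job this posting describes. The database, table, and bucket names (raw_db, orders, s3://example-curated-bucket/...) are hypothetical placeholders, not details from the listing.

```python
# Minimal AWS Glue PySpark job sketch: read a cataloged source table,
# apply a simple transformation, and write partitioned Parquet to S3.
# All names (raw_db, orders, the S3 path) are hypothetical examples.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (table registered by a crawler).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Convert to a Spark DataFrame for column-level transformations.
df = (
    dyf.toDF()
    .withColumn("order_date", F.to_date("order_ts"))
    .dropDuplicates(["order_id"])
)

# Write curated, partitioned Parquet back to S3.
(
    df.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/curated/orders/")
)

job.commit()
```

In practice a job like this would typically be parameterized, unit-tested against a local Spark session, and scheduled via Airflow or a Glue workflow, matching the orchestration and CI/CD expectations in the posting.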

Posted 4 days ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary:
We are seeking a skilled and motivated Microsoft Fabric Developer (3-6 years of experience) to join our data engineering team. The ideal candidate will have hands-on experience in building and maintaining data pipelines, working with Microsoft Fabric components, and delivering scalable data solutions in cloud environments. The ideal candidate will also have a strong background in data modelling, data transformation, and data analytics and reporting, with expertise in cutting-edge technologies such as Power BI and scripting engines including Power Automate, Power Query, DAX and TypeScript. The primary objective of the Software Developer will be to design and maintain high-quality software solutions that facilitate data analytics and reporting for our organization.

Key Responsibilities:
Design, develop, and maintain software applications that support the organization's data analytics and reporting requirements.
Design, develop, and maintain data pipelines using Microsoft Fabric (OneLake, Lakehouse, Dataflows, Pipelines, Notebooks).
Implement ETL/ELT processes using Azure Data Factory, Synapse, and Spark (PySpark, Spark SQL); a minimal notebook sketch follows this posting.
Optimize data ingestion, transformation, and loading processes for performance and scalability.
Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver solutions.
Ensure data quality, security, and compliance with governance standards.
Monitor and troubleshoot data workflows and resolve performance bottlenecks.
Document technical designs, processes, and best practices.
Develop comprehensive data models and transformation solutions to facilitate accurate and efficient reporting.
Develop engaging and interactive dashboards and reports utilizing Power BI.
Build automation workflows utilizing Power Automate.
Produce efficient, readable, and scalable code using TypeScript.
Collaborate closely with cross-functional teams to identify requirements, develop solutions, and ensure on-time delivery of projects.
Conduct thorough unit testing, as well as timely troubleshooting and issue resolution as required.
Learn new technologies, including new Power BI features and Azure. Stay informed on the latest developments in data analytics and reporting technologies.

Key Requirements:
3 to 5 years of experience as a software developer, with a proven track record in data analytics and reporting.
Expertise in data modelling, data transformation, and data analytics.
Strong proficiency in technologies such as Power BI and scripting engines including Power Automate and TypeScript.
Good to have: knowledge and experience of Azure services.
Excellent problem-solving skills with keen attention to detail.
Ability to work effectively as part of a collaborative, cross-functional team.
Strong communication skills with the ability to convey technical concepts to non-technical stakeholders.
Proven experience utilizing Agile development methodologies.
Bachelor's or master's degree in computer science or a related field.

Skills Required:
Working with REST APIs and web services.
Proficient in XSLT, CSS, JavaScript, React JS, Node JS, D3 JS.
Hands-on experience scripting in TypeScript.
Experience with Python or .NET technologies would be an added advantage.
Working experience with PL/SQL, SQL, and NoSQL databases.

Certifications:
Azure or AWS certified associate.

Applicants may be required to appear onsite at a Wolters Kluwer office as part of the recruitment process.
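As a concrete illustration of the Fabric notebook work described above, here is a minimal PySpark sketch of a bronze-to-silver Lakehouse transformation. The table names (bronze_sales, silver_sales) and columns are hypothetical, and the pattern assumes a Spark runtime where Lakehouse tables are addressable via Spark SQL.

```python
# Minimal sketch of a Fabric Lakehouse notebook transformation:
# read a bronze Delta table, clean and aggregate it, write a silver table.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook a session is pre-created; this line keeps
# the sketch runnable in other Spark environments.
spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("bronze_sales")

silver = (
    bronze
    .filter(F.col("quantity") > 0)            # drop cancelled/invalid rows
    .withColumn("sale_date", F.to_date("sale_ts"))
    .groupBy("sale_date", "product_id")
    .agg(
        F.sum("quantity").alias("units_sold"),
        F.sum(F.col("quantity") * F.col("unit_price")).alias("revenue"),
    )
)

# Overwrite the curated table as Delta, the default Lakehouse table format.
silver.write.mode("overwrite").format("delta").saveAsTable("silver_sales")
```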

Posted 4 days ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

ETL Developer
We are looking for a Senior ETL Developer for our Enterprise Data Warehouse. In this role you will be part of a team working to develop solutions that enable the business to leverage data as an asset at the bank. The Senior ETL Developer should have extensive knowledge of data warehousing and cloud technologies. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.

Key Responsibilities
Translate requirements and data mapping documents into a technical design.
Develop, enhance and maintain code following best practices and standards.
Create and execute unit test plans (a minimal PySpark unit-test sketch follows this posting).
Support regression and system testing efforts.
Debug and problem-solve issues found during testing and/or production.
Communicate status, issues and blockers to the project team.
Support continuous improvement by identifying and pursuing opportunities.

Basic Qualifications
Bachelor's degree or military experience in a related field (preferably computer science).
At least 5 years of experience in ETL development within a Data Warehouse.
Deep understanding of enterprise data warehousing best practices and standards.
Strong experience in software engineering, comprising designing, developing and operating robust and highly scalable cloud infrastructure services.
Strong experience with Python/PySpark, DataStage ETL and SQL development.
Proven experience in cloud infrastructure projects, with hands-on migration expertise on public clouds such as AWS and Azure, preferably Snowflake.
Knowledge of Cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies.
Understanding of Authentication/Authorization Services and Identity & Access Management.
Strong communication and interpersonal skills.
Strong organization skills and the ability to work independently as well as with a team.

Preferred Qualifications
AWS Certified Solutions Architect Associate, AWS Certified DevOps Engineer Professional and/or AWS Certified Solutions Architect Professional.
Experience defining future-state roadmaps for data warehouse applications.
Experience leading teams of developers within a project.
Experience in the financial services (banking) industry.

Mandatory Skills
ETL / data warehouse concepts
Snowflake
CI/CD tools (Jenkins, GitHub)
Python
DataStage
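To illustrate the "create and execute unit test plans" responsibility above, here is a minimal pytest sketch for a PySpark transformation. The function (dedupe_latest) and its columns are hypothetical examples, not taken from the listing.

```python
# Minimal pytest sketch for unit-testing a PySpark ETL transformation.
# The transformation (dedupe_latest) and its columns are hypothetical.
import pytest
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F


def dedupe_latest(df):
    """Keep only the most recent row per customer_id, ordered by updated_at."""
    w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
    return df.withColumn("rn", F.row_number().over(w)).filter("rn = 1").drop("rn")


@pytest.fixture(scope="module")
def spark():
    session = SparkSession.builder.master("local[1]").appName("etl-tests").getOrCreate()
    yield session
    session.stop()


def test_dedupe_latest_keeps_newest_row(spark):
    df = spark.createDataFrame(
        [(1, "2024-01-01"), (1, "2024-02-01"), (2, "2024-01-15")],
        ["customer_id", "updated_at"],
    )
    result = {r["customer_id"]: r["updated_at"] for r in dedupe_latest(df).collect()}
    assert result == {1: "2024-02-01", 2: "2024-01-15"}
```

Tests like this run locally against a single-node Spark session and slot naturally into the Jenkins/GitHub CI/CD pipelines the posting lists as mandatory skills.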

Posted 4 days ago

Apply

10.0 - 15.0 years

30 - 35 Lacs

Hyderabad

Work from Office

What is the Director - Research Scientist AI & Optimization responsible for?
The core mandate of this role is to bring innovative digital investment products and solutions to market, leveraging a patented and innovative digital WealthTech/FinTech product - Goals Optimization Engine (GOE) - built on several years of academic research in mathematical optimization, probability theory and AI techniques. The mandate also extends to leveraging cutting-edge AI, such as Generative AI in addition to Reactive AI, to create value within various business functions at Franklin Templeton such as Investment Solutions, Portfolio Management, Sales & Distribution, Marketing, and HR, among others, in a responsible and appropriate manner. The possibilities are limitless here and present a fantastic opportunity for a self-motivated and driven professional to make significant contributions to the organization and to themselves.

What are the ongoing responsibilities of a Director - Research Scientist AI & Optimization?
As a Principal Research Scientist - AI and Optimization, you will play a pivotal role in driving innovation, product research, and proofs of concept for our AI research and Goals Optimization Engine (GOE) product roadmap. You will be responsible for mentoring and guiding a team of highly motivated research scientists, creating intellectual property, and ensuring successful client deployments and product development.

Key Responsibilities:

Innovation, Product Research, Proof of Concepts, Pseudocode & Design (40%):
Lead and contribute to the multi-year Goals Optimization Engine (GOE) product roadmap, conceptualizing fit against various industry use cases, creating product variants, and designing new features and enhancements across multiple distribution lines and geographies.
Mentor and guide a team of research scientists to achieve common objectives.
Serve as the Subject Matter Expert (SME) for a specific domain within AI and/or Optimization, acting as the go-to person for all internal stakeholders.
Develop pseudocode and working prototypes in a Python environment, collaborating closely with Product Managers and Product Developers (an illustrative prototyping sketch follows this posting).
Create well-articulated design documents and presentations to explain research to internal and external stakeholders, including clients and partners located globally.
Lead industry research and evaluate partnerships with third-party vendors and specialized service providers where appropriate.
Maintain a thorough understanding of boundary conditions, regulatory environments, data challenges, technology integrations, algorithmic dependencies, and operational process nuances to ensure nothing slips through the cracks.
Stay up to date with the latest developments in the Investment Management industry, Financial Mathematics, Portfolio Construction, and Portfolio Management.

IP Creation, Paper Writing, and Thought Leadership (30%):
Conceptualize and produce high-quality intellectual property for publication in top-tier academic and practitioner journals.
Peer-review the work of other research scientists and improve the outcome of their research output.
Create patent-worthy intellectual content, apply for patents, and win them.
Take responsibility for winning industry awards for exceptional product research and innovative work products.
Stay informed about the latest industry research and evaluate it objectively and in an unbiased manner.
Publish research works for conferences.

Client Deployment, Product Development, and Vendor Due Diligence (30%):
Act as the SME in initial client deployment discussions, showcasing the rigor of research, explaining the product or solution concept, and engaging in discussions with similar individuals/teams from the client/partner side.
Contribute to product development by ensuring alignment with research and design.
Provide hands-on support where required to complete time-critical work successfully.
Engage with third-party vendors and potential integration partners to understand their capabilities, methodologies, and algorithms, and perform rigorous due diligence to make clear Go/No-Go recommendations.

What ideal qualifications, skills & experience would help someone to be successful?

Education:
Bachelor's and master's degree in STEM disciplines; a PhD in a relevant discipline (Optimization, Probability, Quant Finance, AI & ML, Computational Mathematics, Statistics, etc.) would be a plus.
Relevant industry certifications.

Experience - Core Skills:
10+ years of applied R&D experience in research departments of reputed organizations post-Masters or PhD.
A track record of real, impact-generating innovation is essential.
Demonstrated ability to create intellectual content, publish papers, and obtain patents.
Ability to effectively bridge the gap between academia and practice, ensuring research is practical and implementable.
Structured thinking and exceptional mathematical skills.
Excellent team player with the ability to work with ambiguity and thrive in chaos.
Familiarity with ML/DL/DRL and NLP, especially Large Language Models, dynamic programming, and/or convex optimization.
Solid experience with AWS and/or Azure.
Familiarity with Python, PySpark, SQL, Hadoop, C++.

Experience - Other Soft Skills:
Proven ability to take initiative and work under pressure in a changing, fast-paced environment.
Exceptional decision-making skills, with the ability to prioritize across needs given limited resources.
Thrives in a startup-like environment: loves dealing with a fast pace and changing needs.
Ability to build relationships both inside and outside of the product organization.
Ability to narrate a story for a problem along with the capacity to dive into minute details.
Superlative communication and consensus-building skills.

Work Shift Timings - 2:00 PM - 11:00 PM IST
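For flavor only: the following is a minimal, generic Monte Carlo sketch of the kind of goals-based prototyping this role describes (estimating the probability that a portfolio reaches a goal value). It is emphatically not the patented GOE methodology, and every parameter is a hypothetical example.

```python
# Generic Monte Carlo sketch: probability that a portfolio reaches a goal.
# Illustrates prototyping style only; NOT the GOE algorithm.
# All parameters below are hypothetical examples.
import numpy as np

def goal_success_probability(
    initial_wealth: float,
    goal_wealth: float,
    years: int,
    annual_return: float,   # expected arithmetic portfolio return
    annual_vol: float,      # annualized volatility
    annual_contribution: float = 0.0,
    n_paths: int = 100_000,
    seed: int = 42,
) -> float:
    """Estimate P(wealth_T >= goal) under lognormal annual returns."""
    rng = np.random.default_rng(seed)
    wealth = np.full(n_paths, initial_wealth)
    for _ in range(years):
        # One lognormal gross return per path per year, calibrated so
        # the expected gross return equals 1 + annual_return.
        growth = rng.lognormal(
            mean=np.log(1 + annual_return) - 0.5 * annual_vol**2,
            sigma=annual_vol,
            size=n_paths,
        )
        wealth = wealth * growth + annual_contribution
    return float(np.mean(wealth >= goal_wealth))

# Example: 100k growing toward 500k over 20 years with 10k/yr contributions.
print(goal_success_probability(100_000, 500_000, 20, 0.07, 0.15, 10_000))
```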

Posted 4 days ago

Apply

4.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site

Job Description
Our NielsenIQ Technology teams are working on revamping multiple platforms into a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on NielsenIQ’s data and insights to innovate and grow. As a Backend Software Engineer, you’ll be part of a team of smart, highly skilled technologists who are passionate about learning and prototyping cutting-edge technologies. Right now, our CDAR platform is concentrating on application convergence, with the latest backend services built on Python technologies, and leverages Jenkins to support CI/CD and integrations. Python is primarily used to extend platform features, while we continue to adopt best-in-class cloud-native, cloud-agnostic technologies. Our team is co-located and agile, with central technology hubs in Chicago, Madrid, Toronto, Chennai and Pune.

Responsibilities:
Strong proficiency in Python and Django REST Framework (a minimal endpoint sketch follows this posting).
Experience with Elasticsearch integration and optimization.
Hands-on experience with PySpark for data processing.
Proficiency in PostgreSQL.
Solid understanding of Azure fundamentals (certification is a plus).
Experience with GitHub Actions for CI/CD.
Proficiency in Docker, especially multi-stage builds.
Experience with Kubernetes for container orchestration.
Strong debugging and problem-solving skills.
Familiarity with Agile methodologies and version control systems (Git).
Interacting with multiple stakeholders.
Unit testing and integration testing.
Understanding user needs and how they fit into the overall, global solution design.
Configuring and implementing application and integration services to support business needs.
Prototyping new features and integrations aligned to business strategy, introducing innovation through technology.
Following source-control and test-driven development best practices.
Troubleshooting and performing root cause analysis while resolving issues.

Qualifications
Must have:
Minimum of 4-7 years of experience as a Python Developer.
Development experience writing unit and integration test cases, e.g. with PyTest.
Intermediate-level database (SQL) skills to develop SQL queries, functions and stored procedures.
Good to have: basic knowledge of Cloud (Azure).
Good understanding of CI/CD pipelines, e.g. Jenkins.
Strong knowledge of version control tools, preferably Bitbucket.
Basic knowledge of Linux/Unix environments (basic commands, shell scripting, etc.).
Demonstrated ability to thrive in an enterprise Agile/SCRUM environment.
Demonstrated ability to work as part of a global team.
Strong troubleshooting and problem-solving skills and excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business.

Nice to have:
Minimum B.Tech/B.E degree in Computer Science, Computer Engineering or a related field (4-year degree).
Experience using collaboration technologies: Azure DevOps, TFS, Jira, Confluence.
Experience using object-oriented languages.
Experience using the Atlassian tool suite, including JIRA, Confluence, Bitbucket.
Experience working with testing tools and automation test needs.
Motivated, high-potential performer, with demonstrated ability to influence and lead.
Strong communicator with excellent interpersonal skills.
Able to solve complex problems and successfully manage ambiguity and unexpected change.
Embracing best practices and feedback as a means of continuous improvement.
Consistently high achiever marked by perseverance, humility and a positive outlook in the face of challenges.

Additional Information
Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
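As a minimal sketch of the Django REST Framework work this posting lists: the model, serializer, and route names below are hypothetical placeholders, not details from the listing.

```python
# Minimal Django REST Framework endpoint sketch: serializer + read-only viewset.
# Model, field, and route names are hypothetical; assumes a configured
# Django project with a 'catalog' app.
from django.db import models
from rest_framework import routers, serializers, viewsets


class Product(models.Model):
    name = models.CharField(max_length=200)
    price = models.DecimalField(max_digits=10, decimal_places=2)

    class Meta:
        app_label = "catalog"  # hypothetical app


class ProductSerializer(serializers.ModelSerializer):
    class Meta:
        model = Product
        fields = ["id", "name", "price"]


class ProductViewSet(viewsets.ReadOnlyModelViewSet):
    """Serves GET /products/ and GET /products/<pk>/ from the ORM."""
    queryset = Product.objects.all().order_by("name")
    serializer_class = ProductSerializer


# Typically wired up in urls.py:
router = routers.DefaultRouter()
router.register(r"products", ProductViewSet)
# urlpatterns = router.urls
```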

Posted 4 days ago

Apply