Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
1.0 - 31.0 years
0 - 0 Lacs
Mahalaxmi Nagar, Indore
Remote
About the Role:
As a Telecaller Executive at Real Space Group, you'll be the driving force behind our sales success. Your mission? To connect with potential clients, spark their interest in our luxury properties, and guide them from curiosity to commitment. You won't just make calls; you'll make connections that matter and deals that count.

What You'll Do:
- Engage potential clients in persuasive, engaging conversations.
- Showcase the unique features of our luxury properties, creating a compelling vision for each client.
- Close deals by expertly handling objections and negotiating terms that satisfy both the client and the company.
- Maintain a thriving pipeline of leads, following up with precision to turn prospects into happy clients.
Posted 4 days ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
JOB_POSTING-3-71378-5

Job Description
Role Title: Manager, Model Risk Management (L09)

Company Overview
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India's Best Companies to Work For by Great Place to Work. We were among the Top 50 of India's Best Workplaces in Building a Culture of Innovation by GPTW, and in the Top 25 Best Workplaces in BFSI by GPTW. We have also been recognized by the AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and among Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer flexibility and choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on advancing diverse talent into leadership roles.

Organizational Overview
Synchrony's Risk Team is a dynamic and innovative team dedicated to providing oversight as the second line of defense. As a member of this team, you'll play a pivotal role in delivering high-quality model validation: ensuring that modeling techniques and results are consistent with their strategic uses, that models perform as intended, and that they comply with related MRM policies, standards and procedures as well as regulations. This role requires expertise in supporting model validation initiatives related to quantitative analytic modeling within the Synchrony Model Governance and Validation team.

If you are passionate about model validation and modeling techniques, Synchrony's Risk team is the place to be.

Role Summary/Purpose
The Manager, Model Validation is responsible for model validation focusing on statistical, Machine Learning (ML) and other models, ensuring they meet the related Model Risk Management policies, standards and procedures as well as regulations (SR 11-7). This role requires expertise in supporting model validation initiatives related to quantitative analytic modeling within the Synchrony Model Governance and Validation team. This is an individual contributor role.

Key Responsibilities
- Conduct full-scope model reviews, annual reviews and ongoing monitoring of model performance for both internally and vendor-developed models, new and existing, statistical/ML or non-statistical, applying effective challenge to identify potential issues.
- Evaluate model development data quality and the conceptual soundness and accuracy of the methodology; conduct model performance testing including back-testing, sensitivity analysis and benchmarking; and identify and highlight issues in a timely manner.
- Produce proper documentation within expected timeframes to effectively highlight findings for further review/investigation and to facilitate informed discussions on key analytics.
- Conduct in-depth analysis of large data sets and support the review and maintenance of relevant models and model validation documentation.
- Communicate technical information effectively, verbally and in writing, to both technical and business teams; write detailed validation documents/reports for management.
- Support additional book of work or special projects as and when required.

Required Skills/Knowledge
- Bachelor's/Master's degree (or foreign equivalent) in Statistics, Mathematics, or Data Science and 2+ years' experience in model development or model validation in the retail sector of U.S. financial services or banking; in lieu of a Master's degree, 4+ years' experience in model development/model validation in the retail sector of financial services or banking.
- Knowledge and experience of customer-facing models, including fraud acquisition, transaction fraud, credit acquisition, credit account management and marketing models.
- Understanding of quantitative analysis methods or approaches in relation to credit risk models.
- Strong programming skills, with 2+ years' hands-on, proven experience using Python, Spark, SAS, SQL and Data Lake to perform statistical analysis and manage complex or large amounts of data.

Desired Skills/Knowledge
- 2+ years of proven experience in Model Risk Management or model development in the financial services industry, including both analytic/modeling/quantitative experience and governance or another credit/financial discipline.
- Ability to apply analytical skills to solve problems creatively.
- Sharp focus on accuracy with extreme attention to detail; able to make recommendations as opportunities arise.
- Self-motivated; acts promptly and effectively on assigned tasks.
- Excellent written and oral communication and presentation skills.

Eligibility Criteria
Bachelor's/Master's degree (or foreign equivalent) in Statistics, Mathematics, or Data Science and 2+ years' experience in model development or model validation in the retail sector of U.S. financial services or banking; in lieu of a Master's degree, 4+ years' experience in model development/model validation in the retail sector of financial services or banking.

Work Timings
This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs; please discuss with the hiring manager for more details.

For Internal Applicants
- Understand the criteria and mandatory skills required for the role before applying.
- Inform your Manager or HRM before applying for any role on Workday.
- Ensure that your Professional Profile is updated (Education, Prior Experience, Other Skills); uploading your updated resume (Word or PDF format) is mandatory.
- You must not be on any corrective action plan (Formal/Final Formal, PIP).
- L4 to L7 employees are eligible only if they have completed 12 months in the organization and 12 months in their current role and level.
- L8+ employees are eligible only if they have completed 18 months in the organization and 12 months in their current role and level.
- L4+ employees can apply.

Grade/Level: 09
Job Family Group: Credit
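The ongoing-monitoring work described above (back-testing, benchmarking, tracking model performance over time) often includes a score-drift check such as the Population Stability Index. A minimal Python sketch follows; the binning scheme and the conventional thresholds (~0.1 warn, ~0.25 act) are general industry rules of thumb, not a Synchrony-specific standard:

```python
# Population Stability Index (PSI): compares a model's score distribution
# at development time against recent production scores to detect drift.
import math

def psi(expected, actual, bins=10):
    """PSI between two score samples bucketed into equal-width bins."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # fall back to 1.0 if all scores equal

    def shares(scores):
        counts = [0] * bins
        for s in scores:
            idx = min(int((s - lo) / width), bins - 1)
            counts[idx] += 1
        # Floor each share at a tiny value so the log term is defined.
        return [max(c / len(scores), 1e-6) for c in counts]

    e, a = shares(expected), shares(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

dev_scores = [i / 100 for i in range(100)]    # development sample
prod_scores = [i / 100 for i in range(100)]   # identical distribution
print(round(psi(dev_scores, prod_scores), 4))  # identical inputs give 0.0
```

An unchanged distribution yields a PSI of zero; a population shift toward higher scores pushes it up, flagging the model for deeper review.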
Posted 4 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Before you apply to a job, select your language preference from the options available at the top right of this page.

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill and passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description
This position participates in the support of batch and real-time data pipelines utilizing various data analytics processing frameworks, in support of data science practices for the Marketing and Finance business units. It supports the integration of data from various sources, performs extract, transform, load (ETL) data conversions, and facilitates data cleansing and enrichment. It performs full systems life cycle management activities, such as analysis, technical requirements, design, coding, testing, and implementation of systems and applications software, and contributes to synthesizing disparate data sources to support reusable and reproducible data assets.

Responsibilities
- Supervises and supports data engineering projects and builds solutions, leveraging strong foundational knowledge in software/application development.
- Develops and delivers data engineering documentation.
- Gathers requirements, defines the scope, and performs the integration of data for data engineering projects.
- Recommends analytic reporting products/tools and supports the adoption of emerging technology.
- Performs data engineering maintenance and support.
- Provides the implementation strategy and executes backup, recovery, and technology solutions to support analysis.
- Uses ETL tools to pull data from various sources and load the transformed data into a database or business intelligence platform.

Required Qualifications
- Codes in a programming language used for statistical analysis and modeling, such as Python, Java, Scala or C#.
- Strong understanding of database systems and data warehousing solutions.
- Strong understanding of the data interconnections between organizations' operational and business functions.
- Strong understanding of the data life cycle stages: data collection, transformation, analysis, secure storage, and data accessibility.
- Strong understanding of the data environment and its ability to scale for demands such as data pipeline throughput, analysis of large amounts of data, real-time predictions, insights and customer feedback, data security, and data regulations and compliance.
- Strong knowledge of data structures, data filtering and data optimization.
- Strong understanding of analytic reporting technologies and environments (e.g., Power BI, Looker, Qlik).
- Strong understanding of a cloud services platform (e.g., GCP, Azure, or AWS) and all the data life cycle stages; Azure preferred.
- Understanding of distributed systems and the underlying business problem being addressed; guides team members on how their work contributes, performing data analysis and presenting findings to stakeholders.
- Bachelor's degree in MIS, mathematics, statistics, or computer science (or international equivalent), or equivalent job experience.

Required Skills
- 3 years of experience with Databricks.
- SSIS/SSAS, Apache Spark, Python, R, SQL, SQL Server.

Preferred Skills
- Scala, Delta Lake, Unity Catalog, Azure Logic Apps, a cloud services platform (e.g., GCP, Azure, or AWS).

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
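The extract, transform, load flow described above can be sketched end to end in a few lines. The sketch below uses Python's built-in sqlite3 as a stand-in warehouse; the table and field names are illustrative, not from any real UPS system:

```python
# Minimal ETL sketch: extract rows from a source, apply a cleansing
# transform, and load the result into a database table.
import sqlite3

def extract():
    # In practice this would read from files, APIs, or another database.
    return [(" Alice ", "1200.50"), ("Bob", "300"), ("  carol", "75.25")]

def transform(rows):
    # Cleansing/enrichment: trim whitespace, normalise case, cast types.
    return [(name.strip().title(), float(amount)) for name, amount in rows]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS revenue (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO revenue VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM revenue").fetchone()[0]
print(total)  # 1575.75
```

A production pipeline adds incremental loads, retries and validation, but the extract/transform/load separation shown here is the shape those responsibilities take.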
Posted 4 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere: on premises, or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it's no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications.

We are looking for passionate technologists to join our Pre-Sales organization to ensure that our growth is grounded and guided by strong technical alignment with our platform and the needs of our customers. MongoDB Cloud Pre-Sales Solution Architects are responsible for guiding our customers and partners in designing and building reliable, scalable systems using our data platform. Our team is made up of seasoned software architects and developers who take direct responsibility for customer success, including the design of their software, deployment, and operations. You'll work closely with our cloud partner leaders and play a key role in winning deals and driving the business forward. You'll be a trusted advisor to a wide range of users, from startups to the world's largest enterprise IT organizations.

We are looking to speak to candidates who are based in Gurugram for our hybrid working model.

As an ideal candidate, you will have:
- 5+ years of software development experience
- 3+ years of pre-sales experience with enterprise software
- 3+ years of experience working with cloud providers (AWS, Microsoft or Google)
- Working knowledge of, and the ability to code in, two or more modern scripting languages (e.g. Python, Node.js, SQL) and/or popular programming languages (e.g. C/C++, Java, C#)
- Experience with scalable and highly available distributed systems
- Excellent presentation and communication skills
- The ability to travel up to 50%
- A Bachelor's degree or equivalent work experience

As an ideal candidate, you may also have:
- Experience selling databases and/or deploying applications with any of the major cloud providers
- Experience with data modeling and programming database-backed applications
- A cloud provider certification (Associate or Professional)
- A MongoDB Certification

What you do at MongoDB:
- You design systems, applications, and infrastructure that help drive some of the world's largest software development projects leveraging MongoDB
- You advise cloud providers' pre-sales teams on architectures, patterns, and strategies for making the best use of MongoDB
- You confidently articulate the business value of MongoDB solutions
- You partner with our Cloud account teams to help ensure success in accounts ranging from small startups to large enterprises
- You support partners and sales with activities such as technical discovery, demos, proofs of value, presentations, sizing, and documentation of technical decision criteria, working across a number of opportunities in parallel
- You translate technical concepts and patterns into business benefits for management and executives
- You act as a liaison, gathering feedback from the field to relay back to the Product Management team
- You help drive demand through participation in industry-known trade shows as well as account-based marketing events
- You demonstrate resilience and sound judgment in dealing with business challenges
- You drive customer and partner demand within a sales territory by being self-motivated and proactive, and by understanding the importance of a strong sense of urgency
- You proactively seek opportunities to support and mentor other pre-sales team members and share best practices
- You have situational awareness and react appropriately in group settings

Particularly with cloud partners:
- You understand the importance of a strong sense of urgency, reflected in your ability to drive partner demand within the sales territory
- You drive partner Center of Excellence strategies and create demand for MongoDB solutions
- You have a strong understanding of the partner's business model, value, and needs
- You successfully build and maintain effective relationships with technical partners, gaining their trust and influencing their decisions
- You anticipate and respond appropriately to partner objections
- You are seen as a "Trusted Advisor" by technical partner stakeholders
- You participate in C-level partner conversations
- You provide proactive responses to help partners understand the solution and identify positive business outcomes
- You drive the strategy and implementation of solution architecture for products and services within the cloud partner ecosystem of the region
- You demonstrate technical expertise in general IT and application operations through mentorship of cloud partner organizations
- You create strategies that map specific industry trends to the account base within cloud partner organizations
- You create strategies that map competitive threats within cloud partner organizations

What you will learn:
- The rapidly expanding MongoDB product suite, including the core database server; MongoDB Atlas (fully managed cloud database service); Atlas Data Federation; Atlas Full-Text Search and Atlas Vector Search; Charts; and other tools and connectors (Ops/Cloud Manager, Compass, Atlas SQL, Connector for Spark, Kafka Connector, etc.)
- Market-relevant, complementary technologies at cloud providers
- Modern and popular architecture design patterns, methodologies and concerns, e.g. microservices, event-driven architectures, DevOps, serverless, security
- Sales techniques and related soft skills: presentations, demonstrations, whiteboarding, discovery, objection handling
- Exposure to a wide variety of market verticals and a broad spectrum of interesting use cases

To drive the personal growth and business impact of our employees, we're committed to developing a supportive and enriching culture for everyone. From employee affinity groups to fertility assistance and a generous parental leave policy, we value our employees' wellbeing and want to support them along every step of their professional and personal journeys. Learn more about what it's like to work at MongoDB, and help us make an impact on the world!

MongoDB is committed to providing any necessary accommodations for individuals with disabilities within our application and interview process. To request an accommodation due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer.

Requisition ID 425386
Posted 4 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Before you apply to a job, select your language preference from the options available at the top right of this page.

Discover your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Description
This position participates in the support of batch and real-time data pipelines utilizing various data analytics processing frameworks, in support of data science practices for the Marketing and Finance business units. It supports the integration of data from various sources, performs extract, transform, load (ETL) data conversions, and facilitates data cleansing and enrichment. It performs full systems life cycle management activities, such as analysis, technical requirements, design, coding, testing, and implementation of systems and applications software, and contributes to synthesizing disparate data sources to support reusable and reproducible data assets.

Responsibilities
- Supervises and supports data engineering projects and builds solutions, leveraging strong foundational knowledge in software/application development.
- Develops and delivers data engineering documentation.
- Gathers requirements, defines the scope, and performs the integration of data for data engineering projects.
- Recommends analytic reporting products/tools and supports the adoption of emerging technology.
- Performs data engineering maintenance and support.
- Provides the implementation strategy and executes backup, recovery, and technology solutions to support analysis.
- Uses ETL tools to pull data from various sources and load the transformed data into a database or business intelligence platform.

Required Qualifications
- Codes in a programming language used for statistical analysis and modeling, such as Python, Java, Scala or C#.
- Strong understanding of database systems and data warehousing solutions.
- Strong understanding of the data interconnections between organizations' operational and business functions.
- Strong understanding of the data life cycle stages: data collection, transformation, analysis, secure storage, and data accessibility.
- Strong understanding of the data environment and its ability to scale for demands such as data pipeline throughput, analysis of large amounts of data, real-time predictions, insights and customer feedback, data security, and data regulations and compliance.
- Strong knowledge of data structures, data filtering and data optimization.
- Strong understanding of analytic reporting technologies and environments (e.g., Power BI, Looker, Qlik).
- Strong understanding of a cloud services platform (e.g., GCP, Azure, or AWS) and all the data life cycle stages; Azure preferred.
- Understanding of distributed systems and the underlying business problem being addressed; guides team members on how their work contributes, performing data analysis and presenting findings to stakeholders.
- Bachelor's degree in MIS, mathematics, statistics, or computer science (or international equivalent), or equivalent job experience.

Required Skills
- 3 years of experience with Databricks.
- SSIS/SSAS, Apache Spark, Python, R, SQL, SQL Server.

Preferred Skills
- Scala, Delta Lake, Unity Catalog, Azure Logic Apps, a cloud services platform (e.g., GCP, Azure, or AWS).

Contract Type: Permanent (CDI)

At UPS, equal opportunity, fair treatment and an inclusive work environment are key values to which we are committed.
Posted 4 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Join us as a Data Engineering Lead

This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences. You'll be simplifying the bank by developing innovative, data-driven solutions, aspiring to be commercially successful through insight, and keeping our customers' and the bank's data safe and secure. Participating actively in the data engineering community, you'll deliver opportunities to support our strategic direction while building your network across the bank. We're recruiting for multiple roles across a range of levels, up to and including experienced managers.

What you'll do
We'll look to you to demonstrate technical and people leadership to drive value for the customer through modelling, sourcing and data transformation. You'll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering and leading a team of data engineers.

We'll also expect you to be:
- Working with Data Scientists and Analytics Labs to translate analytical model code into well-tested, production-ready code
- Helping to define common coding standards and model performance monitoring best practices
- Owning and delivering the automation of data engineering pipelines through the removal of manual stages
- Developing comprehensive knowledge of the bank's data structures and metrics, advocating change where needed for product development
- Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight
- Leading and delivering data engineering strategies to build a scalable data architecture and a feature-rich customer dataset for data scientists
- Leading and developing solutions for streaming data ingestion and transformation in line with our streaming strategy

The skills you'll need
To be successful in this role, you'll need to be an expert-level programmer and data engineer with a qualification in Computer Science or Software Engineering. You'll also need a strong understanding of data usage and dependencies with wider teams and the end customer, as well as extensive experience in extracting value and features from large-scale data. We'll also expect you to have knowledge of big data platforms like Snowflake, AWS Redshift, Postgres, MongoDB, Neo4j and Hadoop, along with good knowledge of cloud technologies such as Amazon Web Services, Google Cloud Platform and Microsoft Azure.

You'll also demonstrate:
- Knowledge of core computer science concepts such as common data structures and algorithms, profiling and optimisation
- An understanding of machine learning, information retrieval or recommendation systems
- Good working knowledge of CI/CD tools
- Knowledge of programming languages used in data engineering, such as Python or PySpark, SQL, Java and Scala
- An understanding of Apache Spark and of ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow
- Knowledge of messaging, event or streaming technology such as Apache Kafka
- Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling
- Extensive experience using RDBMS, ETL pipelines, Python, Hadoop and SQL
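Automated data quality testing, listed above, frequently amounts to declarative rules run against each batch of records, with failures collected for triage. A minimal sketch; the rule names, fields and thresholds are illustrative, not from any specific bank's framework:

```python
# Minimal automated data-quality check: run declarative rules over a
# batch of records and collect (record index, rule name) failures.
def check_batch(records, rules):
    failures = []
    for i, rec in enumerate(records):
        for name, rule in rules.items():
            if not rule(rec):
                failures.append((i, name))
    return failures

rules = {
    "id_present":    lambda r: r.get("id") is not None,
    "amount_nonneg": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0,
    "country_code":  lambda r: isinstance(r.get("country"), str) and len(r["country"]) == 2,
}

batch = [
    {"id": 1, "amount": 10.0, "country": "IN"},
    {"id": None, "amount": -5, "country": "India"},
]
print(check_batch(batch, rules))
# [(1, 'id_present'), (1, 'amount_nonneg'), (1, 'country_code')]
```

In a pipeline, the same rules run on every batch before load, so data quality becomes a tested, automated stage rather than a manual one.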
Posted 4 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Join us as a Data Engineer

We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day to day, you'll develop innovative, data-driven solutions through data pipelines, modelling and ETL design, while aspiring to be commercially successful through insights. If you're ready for a new challenge and want to bring a competitive edge to your career profile by delivering streaming data ingestion, this could be the role for you. We're offering this role at associate vice president level.

What you'll do
Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, and advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for:
- Driving customer value by understanding complex business problems and requirements, and correctly applying the most appropriate, reusable tools to build data solutions
- Participating in the data engineering community to deliver opportunities that support our strategic direction
- Carrying out complex data engineering tasks to build a scalable data architecture, and transforming data to make it usable by analysts and data scientists
- Building advanced automation of data engineering pipelines through the removal of manual stages
- Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required

The skills you'll need
To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience with SQL and NoSQL databases to support diverse data requirements. We'll expect you to have a minimum of eight years of experience in ETL technical design; data quality testing, cleansing and monitoring; data sourcing, exploration and analysis; and data warehousing and data modelling.

You'll also need:
- Experience in developing and maintaining high-quality, reusable code in PySpark and Spark SQL
- Experience in development using technologies such as Spark and Kafka
- Great communication skills, with the ability to collaborate with software engineering teams to integrate data solutions into existing applications
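The streaming-ingestion work mentioned above (Spark, Kafka) typically centres on windowed aggregation over timestamped events. A pure-Python illustration of a tumbling window, with no real Kafka or Spark dependency; the event names are made up for the example:

```python
# Tumbling-window aggregation: the core pattern behind Spark Structured
# Streaming jobs that read event streams from Kafka.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """events: iterable of (epoch_seconds, key).
    Returns {(window_start, key): count} for fixed, non-overlapping windows."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
print(tumbling_window_counts(events, 10))
# {(0, 'click'): 2, (0, 'view'): 1, (10, 'click'): 1}
```

A real streaming job adds watermarks for late data and incremental state, but the grouping of events into fixed time buckets is the same idea.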
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join Barclays as a Senior Data Engineer. At Barclays, we are building the bank of tomorrow. As a Strategy and Transformation Lead you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes to ensure that all data is accurate, accessible, and secure. To be a successful Senior Data Engineer, you should have experience with: Hands on experience to work with large scale data platforms & in development of cloud solutions in AWS data platform with proven track record in driving business success. Strong understanding of AWS and distributed computing paradigms, ability to design and develop data ingestion programs to process large data sets in Batch mode using Glue, Lambda, S3, redshift and snowflake and data bricks. Ability to develop data ingestion programs to ingest real-time data from LIVE sources using Apache Kafka, Spark Streaming and related technologies. Hands on programming experience in python and PY-Spark. Understanding of Dev Ops Pipelines using Jenkins, GitLab & should be strong in data modelling and Data architecture concepts & well versed with Project management tools and Agile Methodology. Sound knowledge of data governance principles and tools (alation/glue data quality, mesh), Capable of suggesting solution architecture for diverse technology applications. Additional Relevant Skills Given Below Are Highly Valued Experience working in financial services industry & working in various Settlements and Sub ledger functions like PNS, Stock Record and Settlements, PNL. Knowledge in BPS, IMPACT & Gloss products from Broadridge & creating ML model using python, Spark & Java. You may be assessed on key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune. 
Purpose of the role To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes to ensure that all data is accurate, accessible, and secure. Accountabilities Build and maintenance of data architectures pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implementation of data warehoused and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Development of processing and analysis algorithms fit for the intended data complexity and volumes. Collaboration with data scientist to build and deploy machine learning models. Vice President Expectations To contribute or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/ processes; deliver continuous improvements and escalate breaches of policies/procedures.. If managing a team, they define jobs and responsibilities, planning for the department’s future needs and operations, counselling employees on performance and contributing to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short and long term goals and ensuring that budgets and schedules meet corporate requirements.. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.. OR for an individual contributor, they will be a subject matter expert within own discipline and will guide technical direction. 
They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Step into the role of a Senior Data Engineer. At Barclays, innovation isn't just encouraged, it's expected. As a Senior Data Engineer you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure. To be a successful Senior Data Engineer, you should have experience with: Hands-on work with large-scale data platforms and development of cloud solutions on the AWS data platform, with a proven track record of driving business success. Strong understanding of AWS and distributed computing paradigms, with the ability to design and develop data ingestion programs that process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake and Databricks. Ability to develop data ingestion programs that ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies. Hands-on programming experience in Python and PySpark. Understanding of DevOps pipelines using Jenkins and GitLab; strong in data modelling and data architecture concepts, and well versed in project management tools and Agile methodology. Sound knowledge of data governance principles and tools (Alation, Glue Data Quality, data mesh), and capable of suggesting solution architectures for diverse technology applications. Additional relevant skills given below are highly valued: Experience working in the financial services industry and in various settlements and sub-ledger functions such as PNS, stock record and settlements, and PNL. Knowledge of the BPS, IMPACT and Gloss products from Broadridge, and of creating ML models using Python, Spark and Java. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.
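The real-time requirement above (Kafka plus Spark Streaming) centres on windowed aggregation over a timestamped stream. The toy function below sketches a tumbling-window count in pure Python; the event shape and window size are illustrative assumptions standing in for what Spark Structured Streaming expresses as groupBy(window(ts), key).count():

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count events per key in fixed (tumbling) time windows.

    events: iterable of (epoch_seconds, key) pairs, as might be
    consumed from a Kafka topic.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its window
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "trade"), (30, "trade"), (65, "quote"), (70, "trade")]
windows = tumbling_window_counts(events, window_seconds=60)
```

A production stream would also need watermarking for late data, which Spark handles declaratively and this sketch omits.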
Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure. Accountabilities: Building and maintaining data architecture pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Development of processing and analysis algorithms fit for the intended data complexity and volumes. Collaboration with data scientists to build and deploy machine learning models. Vice President Expectations: To contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. Alternatively, as an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction.
They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 4 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Join us as a Data Engineering Lead. This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences. You'll be simplifying the bank by developing innovative data-driven solutions, aspiring to be commercially successful through insight, and keeping our customers' and the bank's data safe and secure. Participating actively in the data engineering community, you'll deliver opportunities to support our strategic direction while building your network across the bank. We're recruiting for multiple roles across a range of levels, up to and including experienced managers. What you'll do: We'll look to you to demonstrate technical and people leadership to drive value for the customer through modelling, sourcing and data transformation. You'll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering and leading a team of data engineers.
We’ll Also Expect You To Be Working with Data Scientists and Analytics Labs to translate analytical model code to well tested production ready code Helping to define common coding standards and model monitoring performance best practices Owning and delivering the automation of data engineering pipelines through the removal of manual stages Developing comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight Leading and delivering data engineering strategies to build a scalable data architecture and customer feature rich dataset for data scientists Leading and developing solutions for streaming data ingestion and transformations in line with streaming strategy The skills you'll need To be successful in this role, you’ll need to be an expert level programmer and data engineer with a qualification in Computer Science or Software Engineering. You’ll also need a strong understanding of data usage and dependencies with wider teams and the end customer, as well as extensive experience in extracting value and features from large scale data. 
We'll also expect you to have knowledge of big data platforms like Snowflake, AWS Redshift, Postgres, MongoDB, Neo4j and Hadoop, along with good knowledge of cloud technologies such as Amazon Web Services, Google Cloud Platform and Microsoft Azure. You'll also demonstrate: Knowledge of core computer science concepts such as common data structures and algorithms, profiling and optimisation. An understanding of machine learning, information retrieval or recommendation systems. Good working knowledge of CI/CD tools. Knowledge of programming languages used in data engineering, such as Python or PySpark, SQL, Java, and Scala. An understanding of Apache Spark and ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow. Knowledge of messaging, event or streaming technology such as Apache Kafka. Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling. Extensive experience using RDBMS, ETL pipelines, Python, Hadoop and SQL.
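The automated data quality testing mentioned above usually means running a battery of named rules over each batch and failing the pipeline (or quarantining rows) on violations. A minimal sketch of that rule-runner pattern, with illustrative rule names and row fields rather than anything from the posting:

```python
def run_quality_checks(rows, rules):
    """Apply named row-level rules; return (row_index, rule_name)
    failures so a pipeline can report, quarantine, or hard-fail."""
    failures = []
    for i, row in enumerate(rows):
        for name, rule in rules.items():
            if not rule(row):
                failures.append((i, name))
    return failures

# Hypothetical rules for a payments feed
rules = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "currency_present": lambda r: bool(r.get("currency")),
}
rows = [
    {"amount": 100, "currency": "GBP"},
    {"amount": -5, "currency": "GBP"},   # violates amount rule
    {"amount": 20, "currency": ""},      # violates currency rule
]
failures = run_quality_checks(rows, rules)
```

Frameworks like Great Expectations or Glue Data Quality generalise exactly this shape: declarative named expectations evaluated per batch, with the failure list driving the QA gate.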
Posted 4 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: Spring Boot. Good-to-have skills: Apache Spark. A minimum of 5 years of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application development with business objectives, ensuring that the solutions provided are effective and efficient. Your role will require you to stay updated with industry trends and best practices to enhance the overall performance of the applications being developed. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Monitor project progress and ensure adherence to timelines and quality standards. Professional & Technical Skills: - Must-have skills: Proficiency in Spring Boot. - Good-to-have skills: Experience with Apache Spark. - Strong understanding of microservices architecture and RESTful APIs. - Experience with cloud platforms such as AWS or Azure. - Familiarity with containerization technologies like Docker and Kubernetes. - The candidate should have a minimum of 5 years of experience in Spring Boot. Additional Information: - This position is based at our Gurugram office. - 15 years of full-time education is required.
Posted 4 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Social Media Marketing Specialist Company: Skincare/Beauty Startup Location: Chennai Position: Full-time Salary: Market standards Perks: Absolute fun. No fine print. Hey there, social media addicts, meme lovers, reel makers, and caption pros, this one is for you! Blende – your soon-to-be-favorite skincare brand – is looking for a Social Media Marketing Specialist who can bring the spark, sass, and strategy to our digital presence. We're talking about the kind of person who: Thinks in trends, dreams in swipe-ups, and wakes up with new reel ideas. Can make even a toner sound like the main character (or at least engaging). Knows when to use a fire emoji vs. a sparkle emoji (yes, it matters). Has a thing for storytelling, brand voice, and aesthetic feeds. Can vibe with a progressive, fun-loving, open-minded team that thrives on ideas, coffee, and good vibes. What we're looking for: A degree in marketing/communication/media OR a really good story. Experience in handling social media for a brand OR just a proven knack for content and audience engagement. A natural trend-spotter. If your "saved" folder on Instagram is better than Pinterest, we want to talk. Someone who can take a fun, exotic beauty brand like Blende and make it the talk of the timeline. What you'll get: A wildly creative space where no idea is "too much." Team lunches, brainstorms that don't feel like work, and occasional confetti moments. A brand that listens, lets you own your ideas, and respects your voice. And yes, we do pay you. As per market standards. On time. Always. If you are ready to blend with us, send your resume, portfolio or even your Instagram handle to: blendeoffice@gmail.com Or WhatsApp us at +91 7200456098 Or just DM us with "I'm your social media specialist" and we'll take it from there!
Posted 4 days ago
2.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
Remote
Job Description Job Title: Data Support Specialist Location: Remote Candidate Expectation: The candidate should have 2+ years of experience in data support. Job Description: The candidate should have 2+ years of experience as a data or quality assurance analyst, ideally working with SQL, PySpark, and/or Python. Should have strong attention to detail and be a methodical problem-solver. Should have excellent oral and written communication skills, with the ability to interact effectively with internal teams across time zones and cultures. Should strive to make tasks as efficient as possible. Should be enthusiastic about making a big impact at a rapidly growing company. Experience working with web-scraped data, transaction data, or email data is a plus, though not required. Skills Required Role: Data Support Specialist - Remote Industry Type: IT/Computers - Software Functional Area: Not specified Required Education: B.E. Employment Type: Full Time, Permanent Key Skills: DATA SUPPORT, PYSPARK, PYTHON Other Information Job Code: GO/JC/166/2025 Recruiter Name: Devikala D
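A routine task in data support for web-scraped or transaction feeds, as described above, is diffing two snapshots of the same dataset to spot unexpected adds, drops, or field changes. The sketch below shows that pattern in plain Python; the `id`/`price` fields are hypothetical examples, not from the posting:

```python
def diff_snapshots(old, new, key="id"):
    """Compare two snapshots keyed by `key`; report added ids, removed ids,
    and field-level (old, new) value changes for records present in both."""
    old_by_id = {r[key]: r for r in old}
    new_by_id = {r[key]: r for r in new}
    added = sorted(set(new_by_id) - set(old_by_id))
    removed = sorted(set(old_by_id) - set(new_by_id))
    changed = {
        k: {f: (old_by_id[k][f], new_by_id[k].get(f))
            for f in old_by_id[k] if old_by_id[k][f] != new_by_id[k].get(f)}
        for k in set(old_by_id) & set(new_by_id)
        if old_by_id[k] != new_by_id[k]
    }
    return added, removed, changed

old = [{"id": 1, "price": 10}, {"id": 2, "price": 20}]
new = [{"id": 1, "price": 12}, {"id": 3, "price": 30}]
added, removed, changed = diff_snapshots(old, new)
```

On real feeds the same comparison is usually a SQL full outer join or a PySpark anti-join, but the logic is identical.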
Posted 5 days ago
9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description Job Title: Big Data Location: Bangalore/Mumbai/Pune/Chennai Candidate Specification: The candidate should have 9+ years in Big Data, with Java and Scala or Hadoop and Scala. Job Description: Design, develop, and maintain scalable big data architectures and systems. Implement data processing pipelines using technologies such as Hadoop, Spark, and Kafka. Optimize data storage and retrieval processes to ensure high performance and reliability. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Perform data modeling, mining, and production processes to support business needs. Ensure data quality, governance, and security across all data systems. Stay updated with the latest trends and advancements in big data technologies. Experience with real-time data processing and stream analytics. Knowledge of advanced analytics and data visualization tools. Knowledge of DevOps practices and tools for continuous integration and deployment. Experience in managing big data projects and leading technical teams. Skills Required Role: Big Data - Manager Industry Type: IT/Computers - Software Functional Area: Not specified Required Education: B.E. Employment Type: Full Time, Permanent Key Skills: BIG DATA, HADOOP, JAVA, SCALA Other Information Job Code: GO/JC/224/2025 Recruiter Name: Devikala D
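The Hadoop/Spark pipelines this role builds all generalise the same map-shuffle-reduce model. A compact, stdlib-only illustration of that model (word count is the conventional example; the input lines are made up for demonstration):

```python
from itertools import groupby
from operator import itemgetter

def map_reduce(records, mapper, reducer):
    """Minimal MapReduce: map each record to (key, value) pairs,
    shuffle/sort by key, then reduce each key group. This is the
    execution model that Hadoop and Spark pipelines distribute
    across a cluster."""
    mapped = [pair for rec in records for pair in mapper(rec)]
    mapped.sort(key=itemgetter(0))  # the shuffle/sort phase
    return {
        key: reducer(key, [v for _, v in group])
        for key, group in groupby(mapped, key=itemgetter(0))
    }

lines = ["spark kafka", "hadoop spark"]
counts = map_reduce(
    lines,
    mapper=lambda line: [(w, 1) for w in line.split()],
    reducer=lambda key, values: sum(values),
)
```

In Spark the same computation is `rdd.flatMap(...).reduceByKey(add)`; the point of the sketch is the three-phase shape, not the scale.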
Posted 5 days ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Responsibilities: Key roles and responsibilities: The DevOps Engineer will be responsible for designing, implementing, and maintaining CI/CD pipelines using Tekton, Harness, Jenkins, and uDeploy to streamline software delivery processes. This role involves managing configuration automation with Ansible, overseeing RHEL/Linux environments to optimize performance and security, and conducting static code analysis using SonarQube. The role requires knowledge of Apache Spark, Scala, Java, Java-Spark, Apache Kafka, and the Cloudera ecosystem, which will be applied to designing processes for the Olympus project from concept to execution. Collaborate with cross-functional teams, including product managers and developers, to create intuitive, user-centered solutions. Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development and model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas. Monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, as well as provide user and operational support on applications to business users. Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, evaluate business processes, system processes, and industry standards, and make evaluative judgements. Recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. Consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems. Ensure essential procedures are followed and help define operating standards and processes. Serve as advisor or coach to new or lower-level analysts. Has the ability to operate with a limited level of direct supervision. Can exercise independence of judgement and autonomy. Acts as SME to senior stakeholders and/or other team members. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 5-8 years of relevant experience. Experience in systems analysis and programming of software applications. Experience in managing and implementing successful projects. Working knowledge of consulting/project management techniques/methods. Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements. Education: Bachelor's degree/University degree or equivalent experience. Technical skillset: Java/Spark/Scala/Kafka, Tekton, Harness, Jenkins, and uDeploy. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. Job Family Group: Technology. Job Family: Applications Development. Time Type: Full time. Most Relevant Skills: Please see the requirements listed above. Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 5 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Grow with us. About this opportunity: It will be practically impossible for human brains to understand how to run and optimize the next generation of wireless networks, i.e., 5G networks with distributed edge compute, that will drive economic and social transformation across all aspects of society. Machine Learning (ML) and other Artificial Intelligence (AI) technologies will be vital for us to handle this opportunity. We are expanding the Ericsson MLOps Engineering Unit with tech-savvy developers with the right attitude and skillset. MLOps is an iterative process for building, deploying, operationalizing, and observing AI/ML systems. The aim of MLOps is to manage the end-to-end life cycle of AI/ML models through all phases: experimentation, development, deployment, model performance monitoring, re-training and, when needed, re-design and re-architecture to keep models operating in optimal condition in production environments. An MLOps platform provides services and components to assist and guide organizations through this iterative process. MLOps platform components are designed to overcome the challenges of developing and operating AI/ML systems at industrial scale in production environments. Role Summary: As a Software Engineer, you will build and deploy MLOps services and components, enabling AI use-case development and production deployment with a focus on scaling, monitoring and performance, re-using the MLOps Platform. The MLOps Platform unit designs, engineers, operates, and maintains a cloud-native, Kubernetes-based micro-service architecture – services and components that aim to deliver end-to-end MLOps features and functionality, e.g. CI/CD, data exploration notebooks (Jupyter), ML model development and deployment, workflow engines, and ML frameworks (e.g. TensorFlow) for easy consumption by Ericsson products and services. The MLOps Platform covers infrastructure capacity and tools for all AI/ML project and system needs across different Ericsson products.
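The monitor-and-retrain loop at the heart of the MLOps lifecycle described above can be reduced to one decision: does live performance still track the validation baseline? A minimal sketch of that trigger, with made-up metric names and a made-up tolerance rather than anything Ericsson-specific:

```python
def needs_retraining(live_metrics, baseline, tolerance=0.05):
    """Flag a deployed model for retraining when any monitored metric
    drifts more than `tolerance` below its validation baseline - the
    monitor/retrain step of the MLOps lifecycle."""
    return any(
        baseline[metric] - value > tolerance
        for metric, value in live_metrics.items()
    )

baseline = {"auc": 0.82, "precision": 0.75}
healthy = needs_retraining({"auc": 0.80, "precision": 0.74}, baseline)  # small dip
drifted = needs_retraining({"auc": 0.72, "precision": 0.74}, baseline)  # AUC fell 0.10
```

In a real platform the metrics would come from a monitoring system such as Prometheus and the positive result would kick off a retraining workflow, but the gate itself stays this simple.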
The main approach is to integrate and extend existing private and public cloud infrastructures, and to base the toolbox components on open-source software. The deployed environments are heterogeneous, so multi-cloud, hybrid-cloud, and WAN networking are also key technology areas. In this role, you are expected to be a very hands-on developer, functioning as an individual contributor as well as working within a cross-functional team that is responsible for the study, design, implementation, test, delivery and maintenance phases of the projects/products. Key Responsibilities: Develop, integrate and automate a core AI/ML software environment, in close collaboration with data scientists and product developers. Operationalize and extend open-source software components, covering the entire ML model life cycle, including e.g. data transformation, model development, deployment, monitoring, re-training and security. Collaborate with product development teams and partners in Ericsson businesses to industrialize a platform for machine learning models and solutions as part of Ericsson offerings, including providing code, workflows and documents. Work with MLOps projects and development teams to identify needs and requirements for AI/ML tools and infrastructure resources. Evaluate and plan capacity of CPU, GPU, memory, storage, and networking resources to balance cost against desired productivity and performance. Develop essential automation scripts and tooling to help with quality assurance, maintenance, migration, and cost control of infrastructure deployments. Manage communication, planning, collaboration and feedback loops with business stakeholders. Model the business problem statement as an AI/ML problem. Contribute to IPR creation for Ericsson in AI/ML. Lead functional and technical analysis within Ericsson businesses and for strategic customers to understand MI-driven business needs and opportunities. Lead studies and creative usage of new and/or existing data sources.
Work with data architects to leverage existing data models and build new ones as needed. Provide MI competence build-up in Ericsson businesses and Customer Serving Units. Develop new, and apply or extend existing, concepts, methodologies and techniques for cross-functional initiatives. Key Qualifications: Bachelor's/Master's in Computer Science, Electrical Engineering or related disciplines from a reputed institute; First Class, preferably with Distinction. Applied experience: 2+ years of experience with infrastructure, platforms, networking, and software systems, and overall industry experience of about 4+ years. Strong software engineering experience with one or more of Golang, Java, Scala, Python and JavaScript, using container-based development practices. Experience with data analytics and AI/ML systems, for example Spark, Jupyter, TensorFlow. Experience with large-scale systems, for example reliability/HA, deployment, operations, testing, and troubleshooting. Experience with delivering software products, for example release management and documentation. Experience with usage and integration of public cloud services, for example identity and access management, key management, storage systems, CPU/GPU, private/virtual networking, and Kubernetes services. Experience with modern distributed systems and tooling, for example Prometheus, Terraform, Kubernetes, Helm, Vault, and CI/CD systems. Experience with WAN networking solutions, redundancy/fail-over, QoS, and VPN technologies.
Experience with infrastructure-as-code and SRE ways of working. Strong system administration skills, Linux and Windows. Awareness of ITIL/ITSM methodologies for operations and service delivery. Soft Skills: Good communication skills in written and spoken English. Great team worker and collaborator. Creativity and the ability to formulate problems and solve them independently. Self-driven, with the ability to work through abstraction. Ability to build and nurture internal and external communities. Ability to work independently with high energy, enthusiasm and persistence. Experience in partnering and collaborative co-creation, i.e., working with complex multiple-stakeholder business units, global customers, technology and other ecosystem partners in a multi-culture, global matrix organization with sensitivity and persistence. Why join Ericsson? At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible. To build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next. What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like. Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer. Primary country and city: India (IN) || Chennai Req ID: 766515
Posted 5 days ago
12.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Company Description WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees. Job Description The role of Sr. Analytics Consultant / Lead exists within the Analytics offshore and onshore teams to develop innovative analytics solutions using data visualization/BI solutions, Gen AI/ML, data analytics and data science that will generate tangible value for business stakeholders and customers. The role has a focus on using sophisticated data analysis and modelling techniques, alongside cognitive computing and artificial intelligence, with business acumen to discover insights that will guide business decisions and uncover business opportunities. Key Accountabilities: Managing large Advanced Analytics teams, owning client delivery and team growth accountabilities with a consulting mindset. Consulting, transformation, and building proposals and competency. Experience within the insurance services industry is essential. Confident in leading large-scale data projects and working in product teams. Highly experienced in solving business problems with data-led tech solutions.
Managing diverse cross-functional teams with a strong commercial mindset. Interpret big data to identify and analyse emerging trends and produce appropriate business insights that monitor the performance of the portfolios and continuously drive an improvement in business results. Develop advanced business performance and data analytics tools to assist the Senior Specialists, Portfolio Managers, business stakeholders (including but not limited to Portfolio, Customer & Marketing functions) and wider Commercial team members to make required data-based recommendations, then implement and monitor them accurately. Develop statistical models to predict business performance and customer behaviour. Research customer behaviours and attitudes, leading to in-depth knowledge and understanding of differences in customer-level profitability. Promote innovation through improving current processes and developing new analytical methods and factors. Identify, investigate and introduce additional rating factors with the objective of improving product risk and location selection to maximise profit. Provide consulting advice and solutions to solve business clients' hardest pain points and realise the biggest business opportunities through advanced use of data and algorithms. Can work on projects across functions based on the needs of the business. Actively add value by lifting the business capability of process automation. Build, enhance and maintain quality relationships with all internal and external customers. Adopt a customer focus in the delivery of internal/external services.
Build positive stakeholder relationships to foster a productive environment and open communication channels. Bring new data science thinking to the group by staying on top of the latest developments, industry meet-ups, etc. Expert-level knowledge of Gen AI/ML, Python, BI and visualization, transformation, and business consulting, including building technical proposals. Expert-level knowledge of statistical concepts. Typically, this role would have 12+ years of relevant experience in a data science or advanced analytical consulting field, with at least 10 years of leadership experience. Experience within the insurance services industry is essential. Confident in leading large-scale data projects and working in product teams. Highly experienced in solving business problems with data-led tech solutions. Managing diverse cross-functional teams with a strong commercial mindset. Qualifications Superior results in a bachelor's degree in a highly technical subject area (statistics, actuarial science, engineering, maths, programming, etc.). Postgraduate degree in a relevant statistical or data science related area (or equivalent, demonstrable online study). Key Capabilities/Technical Competencies (skills, knowledge, technical or specialist capabilities) Mandatory: Proven ability to engage in a team to achieve individual, team and divisional goals. Consulting, transformation, and building proposals and competency. Lead and manage large-scale teams from a people, project management and client management perspective. Solid programming experience in Tableau, R, SQL and Python (AI/ML). Experience with AWS or another cloud service. Familiarity with data lake platforms (e.g. Snowflake and Databricks). Demonstrated understanding of advanced machine learning algorithms, including some exposure to NLP and image processing. Expert-level understanding of statistical concepts. Planning and organization – advanced level. Delegation, project management, delivery and productionizing analytical services – advanced level. High degree of specialist expertise within one or more data science capabilities, e.g. unstructured data analysis, cognitive/AI solutions (incl. use of API platforms such as IBM Watson, MS Azure, AWS, etc.), Hadoop/Spark-based ecosystems. Familiarity with Gen AI concepts. Highly Valued: Good understanding of insurance products, the industry, the market environment, customer segments and key business drivers. Strong knowledge of finance and budgeting/forecasting of key business drivers, with the ability to interpret and analyse reports relevant to the area of responsibility. Additional Requirements Creativity and Innovation - a curious mind that does not accept the status quo; design thinking experience highly valued. Communication Skills – superior communication skills to co-design solutions with customers; emotional intelligence and the ability to communicate complex ideas to a range of internal stakeholders; consulting skills highly valued. Business Focus - advanced analytical skills must be practiced with a significant business focus to ensure practical solutions that deliver tangible value. Strategic Focus - critically evaluate both the company's and key business customers' strategy, while keeping abreast of best-practice advanced analytics strategies. Change management capability - the ability to recognise, understand and support the need for change and its anticipated impact on both the team and self. 
Adaptable and responsive to a continuously changing environment. Customer service - proven commitment to delivering a quality, differentiated experience. Time management skills – prioritisation of work without supervision. Project management - ability to plan, organize, implement, monitor and control projects, ensuring efficient utilisation of technical and business resources to achieve project objectives. Partnering - ability to deliver solutions utilising both onshore and offshore resources.
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Micoworks is a company with a clear mission: to Empower every brand for the better future. This ambitious goal sets the stage for their vision and core values. Who we are By 2030, Micoworks aims to be the Asia No.1 Brand Empowerment Company. This mid-term goal outlines their dedication to becoming the leading force in empowering brands across Asia. To achieve their mission and vision, Micoworks identifies four key values that guide their work: WOW THE CUSTOMER, SMART SPEED, OPEN MIND, ALL FOR ONE. Micoworks' mission, vision, and values paint a picture of a company dedicated to empowering brands, working with agility and open-mindedness, and prioritising customer success. Job Summary The Senior Data Scientist will work on data-driven initiatives to solve complex business challenges, leveraging advanced analytics, machine learning, and statistical modeling. This role requires expertise in translating data insights into actionable strategies and collaborating with cross-functional teams. Ideal candidates will have a strong background in analytics or tech-driven industries. Key Responsibilities Develop and deploy predictive models (e.g., customer lifetime value, media mix modeling, time-series forecasting) using Python/R, TensorFlow, or PyTorch. Clean, preprocess, and validate large datasets (structured/unstructured) from multiple sources. Partner with stakeholders (e.g., marketing, finance) to design data-driven solutions (e.g., A/B testing). Ensure adherence to data privacy and ethical AI practices. Research and implement cutting-edge techniques (e.g., NLP, deep learning) to enhance business strategies. Required Qualifications Education: Master's/PhD in Statistics, Computer Science, Econometrics, or related quantitative fields. Experience: 5+ years in data science, with expertise in: Programming: Python/R, SQL, Spark, and libraries (Pandas, Scikit-learn). Statistical methods: decision trees, regression, deep learning, and experimental design. 
Cloud platforms: Azure, Databricks, or AWS. Soft skills: strong storytelling, stakeholder management, and problem-solving.
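The A/B-testing work mentioned in the responsibilities above can be illustrated with a minimal, dependency-free sketch: a two-proportion z-test comparing conversion rates between a control and a variant. The conversion counts here are invented for illustration.

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF built from erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 200/4000 conversions; variant: 260/4000 conversions
z, p = ab_test_z(200, 4000, 260, 4000)
```

In practice a library such as SciPy or statsmodels would be used, but the underlying test is the same; a p-value below the chosen significance level suggests the variant's lift is unlikely to be noise.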
Posted 5 days ago
1.0 years
0 Lacs
Kochi, Kerala, India
Remote
Job Summary We are seeking a highly experienced Azure Data Engineer for a long-term freelance engagement (1 year+) . This is a remote opportunity to work with a forward-thinking team on cloud data engineering projects using the latest Azure technologies. If you're an expert in building scalable data solutions and enjoy contract-based flexibility, we'd love to connect with you. Key Responsibilities Design and develop scalable data pipelines using Azure Data Factory , Databricks , and Synapse Analytics Build and manage data lakes and data warehouses with Azure Data Lake Gen2 and Azure SQL Database Perform ETL/ELT transformations using SQL , Python , and Spark Implement and maintain CI/CD pipelines for data deployments Monitor and optimize performance using Azure Monitor , Log Analytics , and Power BI Collaborate with stakeholders to deliver clean, secure, and well-modeled data solutions Ensure data governance, compliance, and architecture best practices Required Qualifications 4+ years of experience in Azure-based data engineering Proficiency in Azure Data Factory , Synapse , Databricks , Azure SQL , and Data Lake Gen2 Strong programming and scripting skills in SQL , Python , and Spark Solid understanding of data modeling , warehousing , and pipeline orchestration Experience with CI/CD and tools like Azure DevOps Familiarity with monitoring tools: Azure Monitor , Log Analytics , Power BI Excellent problem-solving and communication skills Preferred Qualifications Microsoft Certified: Azure Data Engineer Associate (DP-203) Experience with real-time data streaming (e.g., Azure Stream Analytics, Kafka) Exposure to big data environments and data governance tools like Purview , Informatica , or Data Catalog What We Offer Long-term contract (12 months or more) 100% remote work Opportunity to work on cutting-edge Azure projects Competitive freelance compensation Supportive and collaborative remote team culture 📩 Interested in this long-term opportunity? 
Let’s connect and discuss further!
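The extract/transform/load stages described in the posting above can be sketched in pure Python (field names, sample records, and the in-memory "load" step are all hypothetical stand-ins; a real pipeline would use Azure Data Factory, Databricks, or Spark):

```python
import json
from datetime import datetime

# Hypothetical raw feed: newline-delimited JSON from a source system
RAW = [
    '{"id": "1", "amount": "42.50", "ts": "2024-01-05"}',
    '{"id": "2", "amount": "bad",   "ts": "2024-01-06"}',  # fails validation
    '{"id": "3", "amount": "10.00", "ts": "2024-01-07"}',
]

def extract(lines):
    for line in lines:
        yield json.loads(line)

def transform(records):
    """Type-cast and validate; drop rows that fail."""
    for r in records:
        try:
            yield {"id": int(r["id"]),
                   "amount": float(r["amount"]),
                   "ts": datetime.strptime(r["ts"], "%Y-%m-%d").date()}
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad rows to a quarantine table

def load(records):
    return list(records)  # stand-in for a warehouse or data-lake write

clean = load(transform(extract(RAW)))
```

The generator-based stages mirror how pipeline frameworks chain transformations lazily; here the malformed second record is dropped, leaving two clean rows.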
Posted 5 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Lead Data Scientist Overview We are the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities. Our Team Within Mastercard The Data & Services team is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services. The New Product Engineering team within Data and Services is engineering the transformation of new ideas into software solutions to test in the market, and then growing and scaling products to meet long-term demand and needs. 
We achieve this with a "move fast, learn fast" mindset, a relentless focus on customer excellence, and a culture of collaboration and empowerment. Our solutions are instrumental in positioning Mastercard as a data & services market leader. Role Summary The Data Scientist on the Data & Services New Product Engineering team is responsible for creating data & analytics solutions, including deep learning Artificial Intelligence (A.I.) and Machine Learning (M.L.) models, that sit atop large datasets of business and finance operations gathered by mid-size to large companies. The Data Scientist is responsible for the full model lifecycle: data analysis, feature engineering, model training, testing, serving and monitoring. As a Data Scientist, You Will Work closely with business owners to understand business requirements and performance metrics regarding data quality and model performance of customer-facing products Work with multiple disparate sources of data and storage systems, building processes and pipelines to provide cohesive datasets for analysis and modeling Generate, maintain, and optimize data pipelines for model building and model performance evaluation Develop, test, and evaluate modern machine learning and A.I. models Oversee implementation of models Evaluate production models against business metrics to drive continuous improvement Essential Skills Data engineering and data science experience Experience with SQL and one or more of the following database technologies: PostgreSQL, Hadoop, Spark Good knowledge of a Linux/Bash environment Python and one or more of the following machine learning libraries: Spark ML, TensorFlow, scikit-learn, XGBoost Good communication skills Highly skilled problem solver Exhibits a high degree of initiative At least an undergraduate degree in CS or a STEM-related field. 
Nice To Have Bachelor’s or Master’s in CS, Data Science, ML/AI, or a related STEM field Understands and implements methods to evaluate own work and others' for bias, inaccuracy, and error Databricks Loves working with error-prone, messy, disparate, unstructured data Experience working with cloud platforms (e.g., Azure, AWS) Experience participating in complex engineering projects in an Agile setting, e.g. Scrum Mastercard is an equal opportunity employer that values diversity and inclusion. Applicants will be considered and treated without regard to gender, gender identity, race, color, ethnicity, national origin, religion, sexual orientation, veteran or disabled status, or any other characteristic protected by applicable law. Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-251483
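One concrete form of the "evaluate production models against business metrics" step above is drift monitoring. A minimal sketch of the Population Stability Index (PSI), a common drift metric, is shown below; the bin proportions are invented for illustration.

```python
from math import log

def psi(expected, actual):
    """Population Stability Index between two binned distributions
    (each a list of bin proportions summing to ~1)."""
    eps = 1e-6  # guard against log(0) on empty bins
    return sum((a - e) * log((a + eps) / (e + eps))
               for e, a in zip(expected, actual))

# Score distribution at training time vs. in production (illustrative)
baseline = [0.25, 0.25, 0.25, 0.25]
current  = [0.30, 0.25, 0.25, 0.20]
drift = psi(baseline, current)
```

A common rule of thumb treats PSI below 0.1 as no significant shift and above 0.25 as a signal to investigate or retrain; the example above lands well under 0.1.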
Posted 5 days ago
8.0 - 13.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description The Role We are looking for talented UI engineers who are passionate about building high quality, modern, front-end solutions that solve complex business problems via innovation and the application of engineering best practices. This role will be responsible for hands-on development of front-end code, working as part of an agile scrum team, and partnering with UX and other engineering specialists to build digital products of the highest standard. Responsibilities Contribute to the development of world-class enterprise applications leveraging the latest technologies and software design patterns Develop and engineer front end solutions within an Agile software delivery team, working to collaboratively deliver sprint goals, write code, and participate in the broader Citi technical community and team-level Agile and Scrum processes Apply knowledge and expertise to all aspects of the software development lifecycle, ensuring software is built to the highest standards Partner with stakeholders, UX and QA on a regular basis Grow and develop subject matter expertise for the relevant area of business Provide support and assistance for new joiners and junior team members Follow and contribute to technical and team standards and practices Collaborate with technical leadership to ensure work is aligned to the broad technical strategy Required Qualifications 8-13 years as a Software Engineer/Developer using modern front-end technologies (Angular, React, Vue, Next, etc.) Experience using modern build tools for front-end solutions (npm, yarn, gulp etc.) Exposure to test strategies and frameworks for UIs (jasmine, karma etc.) Clear understanding of software engineering best practices (unit testing, automation, design patterns etc.) 
Bachelor's degree in engineering, computer science, computer engineering, or equivalent work experience Preferred Qualifications Exposure to Service-Oriented and Microservices architectures, including REST and GraphQL implementations Exposure to building horizontally scalable, highly available, highly resilient, and low latency applications Exposure to Cloud infrastructure, both on-premises and public cloud (i.e., OpenShift, AWS, etc.) Exposure to Cloud-native development and Container Orchestration tools (Serverless, Docker, Kubernetes, OpenShift, etc.) Exposure to API Management tools Exposure to event-driven design and architecture (Kafka, Spark, Flink, etc.) Exposure to Continuous Integration and Continuous Delivery (CI/CD) pipelines, either on-premise or public cloud (i.e., Tekton, Harness, CircleCI, Cloudbees Jenkins, etc.) Exposure to using Infrastructure as Code tools (Terraform, CloudFormation, etc.) Exposure to Security, Observability, and Monitoring tools (Grafana, Prometheus, Splunk, ELK, CloudWatch, etc.) Exposure to agile and iterative software delivery Exposure to database concepts (RDBMS, NoSQL) ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. 
If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 5 days ago
2.0 - 4.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation-inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do As a Content Specialist – BCG Vantage within the Content Management team, you will help to build & improve BCG's content base. The primary focus will be content management activities, such as content curation, content capturing, managing & sharing meaningful content across various document repositories and bringing it back to the broader BCG community. You will also be required to work closely with the Content Manager & Practice Area (PA) on special projects to improve the quality of content on our intranet database. You will provide primary content capture & curation support for Climate & Sustainability (C&S) cases and related materials for BCG’s content management system. This includes connecting with case teams, seeking information, and writing and indexing case descriptions on our internal portal. You will be closely working with the C&S PA in capturing and publishing case documents and practice area materials to improve accuracy and quality. 
A key tenet of this role involves supporting the Climate & Sustainability Practice Area to execute priority content projects such as case vignette capture, client reference capture, newsletters, etc. To achieve this, you will work closely with the Content Manager. This is a non-client-facing role. Climate & Sustainability is a fast-growing practice and a driving force for BCG's ambition to become the most positively impactful company in the world. We believe that we can transform how the private sector creates competitive advantage and support the public sector in striding towards its net zero and sustainability ambitions. Together with our clients, we believe we can help solve some of the most pressing social and environmental challenges. You can find more about BCG's own sustainability ambition in BCG's 2024 sustainability report, Scaling Impact in a Changing World. What You'll Bring Bachelor’s degree required – preferably in business or a related research/analysis-intensive field 2-4 years of relevant work experience or equivalent, preferably in the Climate & Sustainability industry Expertise in the relevant sector/topic Fluency in English Excellent business writing skills Strong analytical capabilities (e.g., Excel, Tableau, Power BI or similar) with a proven ability to analyse the PA's content needs and gaps and strategically aid in defining content priorities Expert understanding of Generative AI tools to be leveraged in day-to-day work Knowledge of business documents such as proposals, credentials, case studies, etc. is desirable. Knowledge of content analytics and reporting will be an advantage. 
Outstanding interpersonal and communication skills to interact with internal and external stakeholders while working in a global, collaborative team environment Who You'll Work With The BCG Vantage team works in close collaboration with case teams and other groups within our firm to help create, retrieve, organize, and analyze the knowledge that enables BCG to deliver superior business value for clients. Our role is to be a trusted partner and catalyst for all parts of BCG in building the development of knowledge as a core competitive advantage – and advancing our firm’s reputation as a global leader in business consulting. Ultimately, our efforts create a firm-wide culture of knowledge sharing and collaboration. Additional info Content Management is a key capability within BCG Vantage that owns end-to-end responsibility for curating and maintaining important parts of the firm’s intellectual property. We deliver greater productivity and speed-to-impact for our case teams to further our clients’ priorities. Leveraging our skills and knowledge of topical content, we team up to deliver the information that powers BCG to gain access to the right experts, IP, data, and tools. YOU'RE GOOD AT Understanding content management concepts and content management as an area of work Managing stakeholders effectively; you are proactive, persistent, confident and able to engage effectively with Director-level stakeholders and global case teams Working in a well-organized, self-starting fashion with good prioritization skills and the ability to work autonomously and as part of a global team Adapting to stakeholder requirements with excellent process and planning skills – strong follow-through and accountability are essential Developing specialized technical and operational skills related to the function/PA. 
Identify and evaluate upcoming trends and topics within their function to build a stronger knowledge base Advancing knowledge of the primary function or PA – sharing best practices and upcoming trends within the team/PA Independent and autonomous interactions and communication with stakeholders, thereby delivering high-quality output Cross-team projects, fostering collaboration and innovation on the job to improve processes/projects Ability to pressure-test solutions to problems; assess potential challenges and proactively deal with problems; assist Junior Specialists with daily work problems Testing and trying available Generative AI tools to enhance content management process efficiencies Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify.
Posted 5 days ago
2.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Veersa - Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142). Founded by industry leaders, with an impressive 85% YoY growth A profitable company since inception Team strength: almost 400 professionals and growing rapidly Our Services Include Digital & Software Solutions: Product Development, Legacy Modernization, Support Data Engineering & AI Analytics: Predictive Analytics, AI/ML Use Cases, Data Visualization Tools & Accelerators: AI/ML-embedded tools that integrate with client systems Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc. Tech Stack: AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js Databases: PostgreSQL, MySQL, MS SQL, Oracle Cloud: AWS & Azure (Serverless Architecture) Website: https://veersatech.com LinkedIn: Feel free to explore our company profile About The Role We are seeking highly skilled and experienced Data Engineers and Lead Data Engineers to join our growing data team. This role is ideal for professionals with 2 to 10 years of experience in data engineering, with a strong foundation in SQL, Databricks, Spark SQL, PySpark, and BI tools like Power BI or Tableau. As a Data Engineer, you will be responsible for building scalable data pipelines, optimizing data processing workflows, and enabling insightful reporting and analytics across the organization. Key Responsibilities Design and develop robust, scalable data pipelines using PySpark and Databricks. Write efficient SQL and Spark SQL queries for data transformation and analysis. Work closely with BI teams to enable reporting through Power BI or Tableau. Optimize performance of big data workflows and ensure data quality. Collaborate with business and technical stakeholders to gather and translate data requirements. Implement best practices for data integration, processing, and governance. 
Required Qualifications Bachelor’s degree in Computer Science, Engineering, or a related field. 2–10 years of experience in data engineering or a similar role. Strong experience with SQL, Spark SQL, and PySpark. Hands-on experience with Databricks for big data processing. Proven experience with BI tools such as Power BI and/or Tableau. Strong understanding of data warehousing and ETL/ELT concepts. Good problem-solving skills and the ability to work in cross-functional teams. Nice To Have Experience with cloud data platforms (Azure, AWS, or GCP). Familiarity with CI/CD pipelines and version control tools (e.g., Git). Understanding of data governance, security, and compliance standards. Exposure to data lake architectures and real-time streaming data pipelines.
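The "efficient SQL queries for data transformation" responsibility above can be illustrated with a small, self-contained ELT-style sketch, using Python's built-in sqlite3 in place of Databricks/Spark SQL (table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_sales(region TEXT, amount REAL);
    INSERT INTO raw_sales VALUES
        ('north', 100.0), ('north', 150.0), ('south', 80.0);
    -- ELT-style transform: derive an aggregate table inside the database
    CREATE TABLE sales_by_region AS
        SELECT region, SUM(amount) AS total, COUNT(*) AS orders
        FROM raw_sales GROUP BY region;
""")
rows = dict(
    (region, total)
    for region, total, _ in conn.execute(
        "SELECT region, total, orders FROM sales_by_region ORDER BY region")
)
```

The point of the ELT pattern, here as in Spark SQL, is that raw data lands first and transformations run inside the engine, so aggregates stay reproducible from source tables.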
Posted 5 days ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are a proud work-from-office company. If you're ready to work on-site in a dynamic, global company, we’d love to hear from you. Position Summary Do you have a passion for building data architectures that enable smooth and seamless product experiences? Are you an all-around data enthusiast with a knack for ETL? We're hiring Data Engineers to help build and optimize the foundational architecture of our product's data. We’ve built a strong data engineering team to date, but have a lot of work ahead of us, including: Migrating from relational databases to a streaming and big data architecture, including a complete overhaul of our data feeds Defining streaming event data feeds required for real-time analytics and reporting Leveling up our platform, including enhancing our automation, test coverage, observability, alerting, and performance As a Data Engineer, you will work with the development team to construct a data streaming platform and data warehouse that serves as the data foundations for our product. Help us scale our business to meet the needs of our growing customer base and develop new products on our platform. You'll be a critical part of our growing company, working on a cross-functional team to implement best practices in technology, architecture, and process. You’ll have the chance to work in an open and collaborative environment, receive hands-on mentorship and have ample opportunities to grow and accelerate your career! 
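The streaming migration goals above can be sketched, under simplifying assumptions, as a tumbling-window aggregation in pure Python (event data and window size are invented; in production this would run on Kafka plus a stream processor such as Flink or Spark):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, event_type) events into fixed-size, non-overlapping
    windows and count occurrences per event type in each window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event belongs to exactly one window, keyed by its start time
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical event feed: (timestamp in seconds, event type)
events = [(0, "click"), (5, "view"), (12, "click"), (19, "click"), (25, "view")]
result = tumbling_window_counts(events, 10)
```

Tumbling windows are the simplest windowing strategy; real-time analytics feeds typically also need watermarking for late-arriving events, which stream processors handle for you.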
Responsibilities Build our next generation data warehouse Build our event stream platform Translate user requirements for reporting and analysis into actionable deliverables Enhance automation, operation, and expansion of real-time and batch data environment Manage numerous projects in an ever-changing work environment Extract, transform, and load complex data into the data warehouse using cutting-edge technologies Build processes for top-notch security, performance, reliability, and accuracy Provide mentorship and collaborate with fellow team members Qualifications Bachelor’s or Master’s degree in Computer Science, Information Systems, Operations Research, or related field required 3+ years of experience building data pipelines 3+ years of experience building data frameworks for unit testing, data lineage tracking, and automation Fluency in Scala is required Working knowledge of Apache Spark Familiarity with streaming technologies (e.g., Kafka, Kinesis, Flink) Nice-to-Haves Experience with Machine Learning Familiarity with Looker is a plus Knowledge of additional server-side programming languages (e.g. Golang, C#, Ruby) PrismHR is a fast-paced SaaS company that provides customers with a cloud-based payroll processing software application. PrismHR also provides professional services including system implementation consulting, custom configurations, and training. Lastly, via the Company’s Marketplace platform, customers and end users access other human resources and employee benefits applications from PrismHR’s Marketplace Partners. Diversity, Equity And Inclusion Program/Affirmative Action Plan We have transformed our company into an inclusive environment where individuals are valued for their talents and empowered to reach their fullest potential. At PrismHR, we strive to continually lead with our values and beliefs that enable our employees to develop their potential, bring their full self to work, and engage in a world of inclusion. 
Ensuring an inclusive environment for our employees is an integral part of the PrismHR culture. We aren't just checking a box; we are truly committed to creating a workplace that celebrates the diversity of our employees and fosters a sense of belonging for everyone. This is essential to our success.

We are dedicated to building a diverse, inclusive, and authentic workplace, so if you’re excited about our roles but your past experience doesn’t align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for these or other open roles. We particularly encourage applicants from traditionally under-represented groups as we seek to increase the diversity of our workforce and provide fair opportunities for all.

As a proud Equal Opportunity and Affirmative Action Employer, PrismHR encourages talent from all backgrounds to join our team. Employment decisions are based on an individual’s qualifications as they relate to the job under consideration. The Company’s policy prohibits unlawful discrimination based on sex (which includes pregnancy, childbirth, breastfeeding, or related medical conditions; the actual sex of the individual; or gender identity or gender expression), race, color, religion (including religious dress practices and religious grooming practices), sexual orientation, national origin, ancestry, citizenship, marital status, familial status, age, physical disability, mental disability, medical condition, genetic information, protected veteran or military status, or any other consideration made unlawful by federal, state, or local laws, ordinances, or regulations. The Company is committed to complying with all applicable laws providing equal employment opportunities. This commitment applies to all persons involved in the operations of the Company and prohibits unlawful discrimination by any employee of the Company, including supervisors and co-workers.
Privacy Policy: For information about how we collect and use your personal information, please see our privacy statement available at https://www.prismhr.com/about/privacy-policy.

PrismHR provides reasonable accommodation for qualified individuals with disabilities and disabled veterans in job application procedures. If you have any difficulty using our online system and you need a reasonable accommodation due to a disability, you may use the following alternative email address to contact us about your interest in employment at PrismHR: taglobal@prismhr.com. Please indicate in the subject line of your email that you are requesting accommodation. Only candidates being considered for a position who require an accommodation will receive a follow-up response.
Posted 5 days ago
The demand for professionals with expertise in Spark is on the rise in India. Spark, an open-source distributed computing system, is widely used for big data processing and analytics. Job seekers in India looking to explore opportunities in Spark can find a variety of roles in different industries.
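For readers new to Spark, the core idea behind its "distributed computing" claim is the map-and-reduce programming model: data is split into partitions, each partition is transformed independently, and partial results are merged. The following is a minimal, stdlib-only Python sketch of that model on a single machine; it is an illustration of the pattern, not actual Spark code (in PySpark the equivalent would be chained `flatMap`/`reduceByKey` calls on an RDD, and the partitions shown here are made up for the example).

```python
from collections import Counter
from functools import reduce

def word_count(partitions):
    """Sketch of Spark's map -> merge model: count words per partition,
    then merge the per-partition results (what reduceByKey would do
    across a cluster)."""
    # "Map" phase: each partition is processed independently
    partial_counts = [
        Counter(word for line in part for word in line.split())
        for part in partitions
    ]
    # "Reduce" phase: merge the partial counts into one result
    return reduce(lambda a, b: a + b, partial_counts, Counter())

counts = word_count([
    ["big data", "spark spark"],  # partition 1
    ["data pipelines"],           # partition 2
])
print(counts["spark"])  # → 2
print(counts["data"])   # → 2
```

The point of the sketch is that each partition's work is independent, which is exactly what lets Spark parallelize the map phase across many machines before a final merge.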
Major Indian tech hubs have a high concentration of tech companies and startups actively hiring for Spark roles.
The average salary range for Spark professionals in India varies by experience level:

Entry-level: INR 4-6 lakhs per annum

Mid-level: INR 8-12 lakhs per annum

Experienced: INR 15-25 lakhs per annum
Salaries may vary based on the company, location, and specific job requirements.
In the field of Spark, a typical career progression may look like:

Junior Developer

Senior Developer

Tech Lead

Architect
Advancing in this career path often requires gaining experience, acquiring additional skills, and taking on more responsibilities.
Apart from proficiency in Spark, professionals in this field are often expected to have knowledge or experience in:

Hadoop

Java or Scala programming

Data processing and analytics

SQL databases

Having a combination of these skills can make a candidate more competitive in the job market.
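Of the complementary skills listed above, SQL is the one most directly transferable: the group-by aggregations interviewers commonly ask about are the same whether written against a SQL database or expressed as a Spark DataFrame `groupBy`. Below is a small illustrative sketch using Python's built-in sqlite3 module; the table name and sample rows are invented for the example.

```python
import sqlite3

# In-memory SQLite table standing in for the kind of SQL aggregation
# a data engineer runs daily (table and data are made up for this sketch).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.5)],
)

# Group-by aggregation: the same logic as Spark SQL or
# DataFrame.groupBy("user_id").sum("amount") on a cluster.
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM events "
    "GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # → [('a', 15.0), ('b', 7.5)]
```

Being able to express the same aggregation both in plain SQL and in a distributed engine's API is a common screening question for Spark roles.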
As you explore opportunities in Spark jobs in India, remember to prepare thoroughly for interviews and showcase your expertise confidently. With the right skills and knowledge, you can excel in this growing field and advance your career in the tech industry. Good luck with your job search!