5.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Must have a minimum of 5 years of experience developing data pipelines. Must have worked on data engineering using Python for at least 5 years. Must have experience working on big data technologies for at least 5 years. Must have experience exploring and proposing solutions and coming up with architecture and technical design. Must have experience working with Docker. Must have experience working on AWS. Must have working experience with DevOps pipelines. Must be an excellent technical leader who can take responsibility for a team and own it. Must have experience practicing agile methods (Scrum and Kanban). Must have experience interacting with US customers on a day-to-day basis.
Posted 1 month ago
6.0 - 8.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Immediate job openings for Big Data Engineer (Pan India, Contract). Experience: 6+ years. Skill: Big Data Engineer. Location: Pan India. Notice Period: Immediate. Employment Type: Contract. Required: PySpark, Azure Databricks, experience with workflows, Unity Catalog, and managed/external data with Delta tables.
Posted 1 month ago
3.0 - 4.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Creates and owns technical solutions for a specific technology area (GCP, SQL, BigQuery, Python, Airflow), delivering work autonomously. 3-4 years of GCP migration experience required. Must have a GCP or AWS certification.
Posted 1 month ago
3.0 - 5.0 years
14 - 19 Lacs
Mumbai, Pune
Work from Office
Company: Marsh McLennan Agency. Description: Marsh McLennan is seeking candidates for the following position based in the Pune office. Senior Engineer/Principal Engineer. What can you expect? We are seeking a skilled Data Engineer with 3 to 5 years of hands-on experience in building and optimizing data pipelines and architectures. The ideal candidate will have expertise in Spark, AWS Glue, AWS S3, Python, complex SQL, and AWS EMR. What is in it for you? Holidays (as per the location); medical and insurance benefits (as per the location); shared transport (provided the address falls in the service zone); hybrid way of working; the chance to diversify your experience and learn new skills; the opportunity to work with stakeholders globally to learn and grow. We will count on you to: Design and implement scalable data solutions that support our data-driven decision-making processes. What you need to have: SQL and RDBMS knowledge (5/5), Postgres; extensive hands-on experience with database systems covering tables, schemas, views, and materialized views. AWS knowledge: core and data engineering services, with Glue, Lambda, EMR, DMS, and S3 in focus. ETL knowledge: any ETL tool, preferably Informatica. Data warehousing. Big data: Hadoop concepts; Spark (3/5); Hive (5/5); Python/Java. Interpersonal skills: excellent communication skills and team lead capabilities; a good understanding of data systems in large organizational setups; passion for deep diving into and working with data and delivering value out of it. What makes you stand out: Databricks knowledge; experience with any reporting tool, preferably MicroStrategy. Marsh McLennan (NYSE: MMC) is the world's leading professional services firm in the areas of risk, strategy and people. The Company's more than 85,000 colleagues advise clients in over 130 countries. With annual revenue of $23 billion, Marsh McLennan helps clients navigate an increasingly dynamic and complex environment through four market-leading businesses. Marsh provides data-driven risk advisory services and insurance solutions to commercial and consumer clients. Guy Carpenter develops advanced risk, reinsurance and capital strategies that help clients grow profitably and pursue emerging opportunities. Mercer delivers advice and technology-driven solutions that help organizations redefine the world of work, reshape retirement and investment outcomes, and unlock health and well-being for a changing workforce. Oliver Wyman serves as a critical strategic, economic and brand advisor to private sector and governmental clients. For more information, visit marshmclennan.com, or follow us on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people regardless of their sex/gender, marital or parental status, ethnic origin, nationality, age, background, disability, sexual orientation, caste, gender identity or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.
Marsh McLennan (NYSE: MMC) is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit marshmclennan.com, or follow on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.
Posted 1 month ago
6.0 - 9.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Lead production support for a Finance DataStage ETL platform in a 24x7 SLA-based engagement. Review job designs and code created by other DataStage developers. Provide support for ongoing inbound/outbound data loads and issue resolution. Monitor DataStage job execution and troubleshoot issues as needed. Troubleshoot DataStage job failures and provide root cause analysis. Optimize existing DataStage jobs for performance (as necessary).
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
As a Technical Specialist, you will contribute to the development and enhancement of Network Management applications for Optical Transport, SDH/SONET, and Photonic/WDM networks. Your role will involve designing, coding, and maintaining high-quality software solutions that enable seamless network configuration, fault supervision, and performance monitoring. With expertise in Core Java, Spring, Kafka, Zookeeper, Hibernate, Python, and React, you will develop scalable applications while integrating advanced algorithms to optimize network management. You will also work with RDBMS, PL/SQL, Linux, Docker, and database concepts, with domain knowledge in OTN and Photonic network management as a valuable addition. In this dynamic environment, you will collaborate with cross-functional teams to deliver innovative solutions that improve functionality and enhance customer satisfaction. You have: a Bachelor's degree with 5 to 8 years of experience; hands-on working experience with Core Java, Spring, Kafka, Zookeeper, Hibernate, and Python; hands-on working experience with UI technologies like React; working knowledge of RDBMS, PL/SQL, Linux, Docker, and database concepts; hands-on experience in development activities such as requirement analysis, design, coding, and unit testing. It would be nice if you also had domain knowledge in OTN and Photonic network management. The team builds Network Management applications for Optics Division products, including Photonic/WDM, Optical Transport, and SDH/SONET. These applications provide users with control over the network in terms of configuration (infrastructure, end-to-end services), fault supervision, and performance monitoring; they interface with different Network Elements, provide a friendly graphical user interface, and implement algorithms and functions that enable easier network management and OPEX reduction. Optics Network Management applications are deployed worldwide in several hundred installations, serving both large and small customers. You will join the development team and contribute to new developments and the maintenance of applications, aiming to enhance functionality and customer satisfaction.
Posted 1 month ago
8.0 - 13.0 years
12 - 16 Lacs
Chennai
Work from Office
Project description: We need a Senior Python and PySpark Developer to work for a leading investment bank client. Responsibilities: Develop software applications based on business requirements. Maintain software applications and make enhancements according to project specifications. Participate in requirement analysis, design, development, testing, and implementation activities. Propose new techniques and technologies for software development. Perform unit testing and user acceptance testing to evaluate application functionality. Ensure the assigned development tasks are completed within the deadlines. Work in compliance with coding standards and best practices. Provide assistance to junior developers when needed. Perform code reviews and recommend improvements. Review business requirements and recommend changes to develop reliable applications. Develop coding documentation and other technical specifications for assigned projects. Act as primary contact for development queries and concerns. Analyze and resolve development issues accurately. Skills, must have: 8+ years of experience in data-intensive PySpark development. Experience as a core Python developer. Experience developing classes, OOP, exception handling, and parallel processing. Strong knowledge of DB connectivity, data loading, transformation, and calculation. Extensive experience with Pandas/NumPy dataframes, slicing, data wrangling, and aggregations. Lambda functions and decorators. Vector operations on Pandas dataframes/series; application of applymap, apply, and map functions. Concurrency and error handling for data pipeline batches of 1-10 GB. Ability to understand business requirements and translate them into technical requirements. Ability to design the architecture of a data pipeline for concurrent data processing. Familiar with creating/designing RESTful services and APIs. Familiar with application unit tests. Working with Git source control. Service-oriented architecture, including the ability to consider integrations with other applications and services. Debugging applications. Nice to have: Knowledge of web backend technology (Django, Python, PostgreSQL). Apache Airflow. Atlassian Jira. Understanding of Financial Markets asset classes (FX, FI, Equities, Rates, Commodities & Credit), various trade types (OTC, exchange traded, Spot, Forward, Swap, Options) and related systems is a plus. Surveillance domain knowledge, regulations (MAR, MiFID, CAT, Dodd-Frank) and related systems knowledge is certainly a plus. Other languages: English C2 Proficient. Seniority: Senior.
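The Pandas skills this posting names (vector operations, apply/map on dataframes and series, lambda functions, decorators) can be illustrated briefly. The sketch below is not from the posting: the trade columns, the thresholds, and the timing decorator are hypothetical, chosen only to show the techniques.

```python
# Illustrative sketch (not from the posting): vectorized Pandas transforms with
# map/apply plus a timing decorator, as referenced in the skills list above.
import time
from functools import wraps

import numpy as np
import pandas as pd


def timed(fn):
    """Decorator that reports how long a transformation step takes."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        print(f"{fn.__name__} took {time.perf_counter() - start:.3f}s")
        return result
    return wrapper


@timed
def enrich(trades: pd.DataFrame) -> pd.DataFrame:
    # Column-wise vector operation (no Python loop).
    trades["notional"] = trades["qty"] * trades["price"]
    # Series.map with a lambda for a simple per-element transform.
    trades["side"] = trades["side"].map(lambda s: s.upper())
    # Row-wise apply for logic that needs several columns at once.
    trades["bucket"] = trades.apply(
        lambda r: "large" if r["notional"] > 1_000 else "small", axis=1
    )
    return trades


if __name__ == "__main__":
    df = pd.DataFrame(
        {
            "qty": np.random.randint(1, 50, 5),
            "price": np.random.uniform(10, 100, 5),
            "side": ["buy", "sell", "buy", "sell", "buy"],
        }
    )
    print(enrich(df))
```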
Posted 1 month ago
5.0 - 9.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Project description: Luxoft, a DXC Technology Company, is an established company focusing on consulting and implementation of complex projects in the financial industry. At the interface between technology and business, we convince with our know-how, well-founded methodology and pleasure in success. As a reliable partner to our renowned customers, we support them in planning, designing and implementing the desired innovations. Together with the customer, we deliver top performance! For one of our clients in the insurance segment we are searching for a Senior Data Scientist with a Databricks and predictive analytics focus. Responsibilities: Design and deploy predictive models (e.g., forecasting, churn analysis, fraud detection) using Python/SQL, Spark MLlib, and Databricks ML. Build end-to-end ML pipelines (data ingestion, feature engineering, model training, deployment) on the Databricks Lakehouse. Optimize model performance via hyperparameter tuning, AutoML, and MLflow tracking. Collaborate with engineering teams to operationalize models (batch/real-time) using Databricks Jobs or REST APIs. Implement Delta Lake for scalable, ACID-compliant data workflows. Enable CI/CD for ML pipelines using Databricks Repos and GitHub Actions. Troubleshoot issues in Spark jobs and the Databricks environment. The client is in the USA; the candidate should be able to work until 11:00 am EST to overlap a few hours with the client and attend meetings. Skills, must have: 5+ years in predictive analytics, with expertise in regression, classification, and time-series modeling. Hands-on experience with Databricks Runtime for ML, Spark SQL, and PySpark. Familiarity with MLflow, Feature Store, and Unity Catalog for governance. Industry experience in Life Insurance or P&C. Skills: Python, PySpark, MLflow, Databricks AutoML, predictive modeling (classification, clustering, regression, time series and NLP), cloud platform (Azure/AWS), Delta Lake, Unity Catalog. Nice to have: Certification as a Databricks Certified ML Practitioner. Other languages: English C1 Advanced. Seniority: Senior.
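The MLflow tracking and tuning workflow described above can be sketched in a few lines. This is a hedged, generic example rather than the client's pipeline: it uses a synthetic dataset and scikit-learn locally instead of Spark MLlib on Databricks, and the parameter values are arbitrary.

```python
# Hedged sketch: logging a churn-style classifier with MLflow tracking and a
# simple tuning loop. Dataset, model and parameter grid are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

for n_estimators in (100, 300):  # minimal manual hyperparameter sweep
    with mlflow.start_run():
        model = RandomForestClassifier(n_estimators=n_estimators, random_state=42)
        model.fit(X_train, y_train)
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        # Track params, metrics and the fitted model for later comparison.
        mlflow.log_param("n_estimators", n_estimators)
        mlflow.log_metric("test_auc", auc)
        mlflow.sklearn.log_model(model, artifact_path="model")
```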
Posted 1 month ago
7.0 - 12.0 years
11 - 15 Lacs
Gurugram
Work from Office
Project description: We are looking for an experienced Data Engineer to contribute to the design, development, and maintenance of our database systems. This role will work closely with our software development and IT teams to ensure the effective implementation and management of database solutions that align with the client's business objectives. Responsibilities: The successful candidate will be responsible for managing technology in projects and providing technical guidance/solutions for work completion: (1) be responsible for providing technical guidance/solutions; (2) ensure process compliance in the assigned module and participate in technical discussions/reviews; (3) prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations; (4) be self-organized and focused on delivering quality software on time. Skills, must have: At least 7 years of experience in development on data-specific projects. Working knowledge of streaming data and the Kafka framework (kSQL/MirrorMaker, etc.). Strong programming skills in at least one of these programming languages: Groovy/Java. Good knowledge of data structures, ETL design, and storage. Must have worked in streaming data environments and pipelines. Experience in near-real-time/streaming data pipeline development using Apache Spark, StreamSets, Apache NiFi, or similar frameworks. Nice to have: N/A. Other languages: English B2 Upper Intermediate. Seniority: Senior.
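Because the role centres on near-real-time pipelines from Kafka, here is a minimal hedged sketch of that pattern using PySpark Structured Streaming, one of the frameworks the posting lists (the team may equally use Groovy/Java, StreamSets or NiFi). The broker address, topic, schema, and output paths are placeholders.

```python
# Hedged sketch: near-real-time ingestion from Kafka with Spark Structured
# Streaming. Requires the spark-sql-kafka package on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Assumed message schema for illustration only.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_ts", TimestampType()),
    StructField("payload", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Land the parsed stream as Parquet with checkpointing for exactly-once sinks.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/events")            # placeholder output path
    .option("checkpointLocation", "/chk/events")
    .start()
)
query.awaitTermination()
```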
Posted 1 month ago
5.0 - 8.0 years
5 - 8 Lacs
Hyderabad
Work from Office
Must-have skills: Azure Databricks, Python and PySpark, Spark. Expert-level understanding of distributed computing principles. Expert-level knowledge and experience in Apache Spark. Hands-on experience in Azure Databricks, Data Factory, Data Lake Store/Blob Storage, and SQL DB. Experience in creating big data pipelines with Azure components. Hands-on programming with Python. Proficiency with Hadoop v2, MapReduce, HDFS, and Sqoop. Experience with building stream-processing systems using technologies such as Apache Storm or Spark Streaming. Experience with messaging systems such as Kafka or RabbitMQ. Good understanding of big data querying tools such as Hive and Impala. Experience with integration of data from multiple data sources such as RDBMS (SQL Server, Oracle), ERP, and files. Good understanding of SQL queries, joins, stored procedures, and relational schemas. Experience with NoSQL databases such as HBase, Cassandra, and MongoDB. Knowledge of ETL techniques and frameworks. Performance tuning of Spark jobs. Experience with designing and implementing big data solutions.
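As a hedged illustration of the Azure Databricks and Data Lake stack above, the sketch below loads CSV files from ADLS into a partitioned Delta table. The storage account, container, column names, and table name are placeholders, not from the posting, and cluster credentials for the storage account are assumed to be configured.

```python
# Hedged sketch: CSV landing zone in ADLS -> partitioned Delta table, with a
# basic repartition step as a nod to the Spark performance-tuning requirement.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

raw = (
    spark.read.option("header", "true")
    .csv("abfss://landing@mystorageacct.dfs.core.windows.net/orders/")  # placeholder
)

(
    raw.repartition("order_date")       # align partitions with the write key
    .write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("bronze.orders")       # placeholder managed Delta table
)

# Downstream consumers query the managed Delta table with Spark SQL.
spark.table("bronze.orders").createOrReplaceTempView("orders")
spark.sql("SELECT order_date, COUNT(*) AS n FROM orders GROUP BY order_date").show()
```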
Posted 1 month ago
6.0 - 11.0 years
8 - 14 Lacs
Pune
Work from Office
Responsibilities: Design, develop, and maintain scalable data pipelines using Databricks, PySpark, Spark SQL, and Delta Live Tables. Collaborate with cross-functional teams to understand data requirements and translate them into efficient data models and pipelines. Implement best practices for data engineering, including data quality and data security. Optimize and troubleshoot complex data workflows to ensure high performance and reliability. Develop and maintain documentation for data engineering processes and solutions. Requirements: Bachelor's or Master's degree. Proven experience as a Data Engineer, with a focus on Databricks, PySpark, Spark SQL, and Delta Live Tables. Strong understanding of data warehousing concepts, ETL processes, and data modelling. Proficiency in programming languages such as Python and SQL. Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services. Excellent problem-solving skills and the ability to work in a fast-paced environment. Strong leadership and communication skills, with the ability to mentor and guide team members.
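Since the stack here includes Delta Live Tables, a short hedged sketch of a DLT pipeline with a data-quality expectation may help. The source path, table names, and the expectation rule are assumptions, and the code runs only inside a Databricks DLT pipeline, not as a standalone script.

```python
# Hedged sketch of a Delta Live Tables pipeline with a simple data-quality rule.
# Paths and names are placeholders; `spark` is supplied by the DLT runtime.
import dlt
from pyspark.sql.functions import col


@dlt.table(comment="Raw orders ingested from cloud storage")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")       # Databricks Auto Loader
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders")                # placeholder landing path
    )


@dlt.table(comment="Validated orders")
@dlt.expect_or_drop("valid_amount", "amount > 0")   # drop rows failing the rule
def orders_silver():
    return dlt.read_stream("orders_bronze").select(
        col("order_id"), col("customer_id"), col("amount").cast("double")
    )
```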
Posted 1 month ago
6.0 - 11.0 years
3 - 7 Lacs
Karnataka
Hybrid
PF detection is mandatory. Looking for a candidate with over 6 years of hands-on involvement in Snowflake. The primary expertise required is in Snowflake: the candidate must be capable of creating complex SQL queries for manipulating data and should excel at implementing complex scenarios within Snowflake. The candidate should also possess a strong foundation in Informatica PowerCenter, showcasing proficiency in executing ETL processes. Strong hands-on experience in SQL and RDBMS. Strong hands-on experience in Unix shell scripting. Knowledge of data warehousing and cloud data warehousing. Should have good communication skills.
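To make the "complex SQL in Snowflake" requirement concrete, here is a hedged sketch that runs a windowed query from Python via the Snowflake connector. The account details, table, and query are illustrative assumptions only.

```python
# Hedged sketch: executing an analytical window-function query on Snowflake.
# Connection details are placeholders; prefer key-pair auth or SSO in practice.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="***",            # placeholder
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

QUERY = """
SELECT customer_id,
       order_date,
       SUM(amount) OVER (PARTITION BY customer_id
                         ORDER BY order_date
                         ROWS BETWEEN 6 PRECEDING AND CURRENT ROW) AS amount_7row
FROM orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) <= 10
"""

cur = conn.cursor()
try:
    for row in cur.execute(QUERY):   # cursor.execute returns an iterable cursor
        print(row)
finally:
    cur.close()
    conn.close()
```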
Posted 1 month ago
0.0 years
6 - 9 Lacs
Hyderābād
On-site
Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate and advance faster than ever. Responsibilities and Tasks: Understand the business problem and the relevant data. Maintain an intimate understanding of company and department strategy. Translate analysis requirements into data requirements. Identify and understand the data sources that are relevant to the business problem. Develop conceptual models that capture the relationships within the data. Define the data-quality objectives for the solution. Be a subject matter expert in data sources and reporting options. Architect Data Management Systems: Design and implement optimum data structures in the appropriate data management system (Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements. Plan methods for archiving/deletion of information. Develop, automate, and orchestrate an ecosystem of ETL processes for varying volumes of data. Identify and select the optimum methods of access for each data source (real-time/streaming, delayed, static). Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model. Develop processes to efficiently load the transformed data into the data management system. Prepare Data to Meet Analysis Requirements: Work with the data scientist to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data, etc.). Develop and code data extracts. Follow standard methodologies to ensure data quality and data integrity. Ensure that the data is fit for use in data science applications. Qualifications and Experience: 0-7 years of experience developing, delivering, and/or supporting data engineering, advanced analytics or business intelligence solutions. Ability to work with multiple operating systems (e.g., MS Office, Unix, Linux, etc.). Experienced in developing ETL/ELT processes using Apache NiFi and Snowflake. Significant experience with big data processing and/or developing applications and data sources via Hadoop, YARN, Hive, Pig, Sqoop, MapReduce, HBase, Flume, etc. Understanding of how distributed systems work. Familiarity with software architecture (data structures, data schemas, etc.). Strong working knowledge of databases (Oracle, MSSQL, etc.) including SQL and NoSQL. Strong mathematics background, analytical, problem-solving, and organizational skills. Strong communication skills (written, verbal and presentation). Experience working in a global, multi-functional environment. Minimum of 2 years' experience in any of the following: at least one high-level client, object-oriented language (e.g., C#, C++, Java, Python, Perl, etc.); at least one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.); software development. Ability to travel as needed. Education: B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics or a related field of study; M.S. degree preferred. About Micron Technology, Inc.: We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all.
With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities from the data center to the intelligent edge and across the client and mobile user experience. To learn more, please visit micron.com/careers. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. To request assistance with the application process and/or for reasonable accommodations, please contact hrsupport_india@micron.com. Micron prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards. Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron. AI alert: Candidates are encouraged to use AI tools to enhance their resume and/or application materials. However, all information provided must be accurate and reflect the candidate's true skills and experiences. Misuse of AI to fabricate or misrepresent qualifications will result in immediate disqualification. Fraud alert: Micron advises job seekers to be cautious of unsolicited job offers and to verify the authenticity of any communication claiming to be from Micron by checking the official Micron careers website.
Posted 1 month ago
6.0 - 11.0 years
5 - 9 Lacs
Hyderabad
Work from Office
6+ years of experience in data engineering projects using Cosmos DB and Azure Databricks (minimum 3-5 projects). Strong expertise in building data engineering solutions using Azure Databricks and Cosmos DB. Strong T-SQL programming skills, or skills in any other flavor of SQL. Experience working with high-volume data, large objects, and complex data transformations. Experience working in DevOps environments integrated with Git for version control and CI/CD pipelines. Good understanding of data modelling for data warehouses and data marts. Strong verbal and written communication skills. Ability to learn, contribute and grow in a fast-paced environment. Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, ADLS Gen2, and Azure Event Hubs. Nice to have: Experience using Jira and ServiceNow in project environments. Experience in implementing data warehouse and ETL solutions.
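As a hedged sketch of the Cosmos DB plus Azure Databricks combination, the snippet below reads a container through the Azure Cosmos DB Spark connector and lands it as a Delta table. The endpoint, key, database/container names, columns, and target table are placeholders, and the connector library is assumed to be installed on the cluster.

```python
# Hedged sketch: Cosmos DB container -> Spark DataFrame -> Delta table.
# All identifiers below are placeholders for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # supplied by the Databricks runtime

cosmos_cfg = {
    "spark.cosmos.accountEndpoint": "https://myaccount.documents.azure.com:443/",  # placeholder
    "spark.cosmos.accountKey": "<key>",                                            # placeholder
    "spark.cosmos.database": "retail",
    "spark.cosmos.container": "orders",
}

orders = spark.read.format("cosmos.oltp").options(**cosmos_cfg).load()

(
    orders.selectExpr("id", "customerId", "cast(total as double) as total")
    .write.format("delta")
    .mode("append")
    .saveAsTable("staging.cosmos_orders")   # placeholder target table
)
```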
Posted 1 month ago
3.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Minimum 3 to 5 years of Talend developer experience. Work on user stories and develop Talend jobs following best practices. Create detailed technical design documents for Talend job development work. Work with the SIT team and be involved in defect fixing for Talend components. Note: IBM Maximo tool knowledge would be an advantage for Coned; otherwise it is OK.
Posted 1 month ago
6.0 - 11.0 years
6 - 9 Lacs
Hyderabad
Work from Office
At least 8+ years of experience in any of the ETL tools: Prophecy, DataStage 11.5/11.7, Pentaho, etc. At least 3 years of experience in PySpark with GCP (Airflow, Dataproc, BigQuery), capable of configuring data pipelines. Strong experience in writing complex SQL queries to perform data analysis on databases such as SQL Server, Oracle, Hive, etc. Possess the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools). Ability to work independently on specialized assignments within the context of project deliverables. Take ownership of providing solutions and tools that iteratively increase engineering efficiencies. Designs should help embed standard processes, systems and operational models into the BAU approach for end-to-end execution of data pipelines. Proven problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge. Communicate openly and honestly. Advanced oral, written and visual communication and presentation skills; the ability to communicate efficiently at a global level is paramount. Ability to deliver materials of the highest quality to management against tight deadlines. Ability to work effectively under pressure with competing and rapidly changing priorities.
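For the PySpark-with-GCP requirement, here is a hedged sketch of a Dataproc-style job that reads from BigQuery with the spark-bigquery connector and writes a daily aggregate back. The project, dataset, table, and bucket names are placeholders, and the connector jar is assumed to be available on the cluster.

```python
# Hedged sketch: BigQuery -> PySpark aggregation -> BigQuery, Dataproc-style.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bq-aggregate").getOrCreate()

txns = (
    spark.read.format("bigquery")
    .option("table", "my-project.finance.transactions")  # placeholder source table
    .load()
)

daily = (
    txns.groupBy(F.to_date("txn_ts").alias("txn_date"))
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

(
    daily.write.format("bigquery")
    .option("table", "my-project.finance.daily_totals")  # placeholder target table
    .option("temporaryGcsBucket", "my-temp-bucket")       # staging bucket required for writes
    .mode("overwrite")
    .save()
)
```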
Posted 1 month ago
4.0 - 9.0 years
15 - 19 Lacs
Pune
Work from Office
Job Title: Technical Specialist, GCP Developer. Location: Pune, India. Role description: This role is for an engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background, have good working experience in Spark and GCP technology, be hands-on, and be able to work independently requiring minimal technical/tool guidance. The candidate should be able to technically guide and mentor junior resources in the team. As a developer you will bring extensive design and development skills to reinforce the group of developers within the team. The candidate will extensively use and apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Design and discuss your own solution for addressing user stories and tasks. Develop, unit-test, integrate, deploy, maintain, and improve software. Perform peer code reviews. Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meeting, sprint planning, retrospectives, etc. Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management). Collaborate with other team members to achieve the sprint objectives. Report progress and update Agile team management tools (JIRA/Confluence). Manage individual task priorities and deliverables. Be responsible for the quality of the solutions you provide. Contribute to planning and continuous improvement activities and support the PO, ITAO, developers and Scrum Master. Your skills and experience: Engineer with good development experience on Google Cloud Platform for at least 4 years. Hands-on experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL and Cloud Functions. Experience in set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps. Create and maintain fully automated CI build processes and write build and deployment scripts. Experience with development platforms (OpenShift/Kubernetes/Docker) configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR. Good knowledge of the core SDLC processes and tools such as HP ALM, Jira, ServiceNow. Knowledge of working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, JSON. Strong analytical skills. Proficient communication skills. Fluent in English (written/verbal). Ability to work in virtual teams and in matrixed organizations. Excellent team player. Open-minded and willing to learn business and technology. Keeps pace with technical innovation. Understands the relevant business area. Ability to share information and transfer knowledge and expertise to team members.
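Given the BigQuery/Dataproc/Composer stack named above, a hedged sketch of a Cloud Composer (Airflow) DAG that submits a PySpark job to Dataproc is shown below. The project, region, cluster, schedule, and GCS URIs are assumptions for illustration.

```python
# Hedged sketch: Composer/Airflow DAG submitting a PySpark job to Dataproc.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},            # placeholder project
    "placement": {"cluster_name": "etl-cluster"},          # placeholder cluster
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/bq_aggregate.py"},
}

with DAG(
    dag_id="daily_bq_aggregate",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_spark = DataprocSubmitJobOperator(
        task_id="run_bq_aggregate",
        job=PYSPARK_JOB,
        region="europe-west3",                             # placeholder region
        project_id="my-project",
    )
```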
Posted 1 month ago
5.0 - 8.0 years
7 - 11 Lacs
Pune
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends to prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers and the client's business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks. Deliver, performance parameters and measures: 1. Process: number of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT. 2. Team Management: productivity, efficiency, absenteeism. 3. Capability development: triages completed, Technical Test performance. Mandatory Skills: PySpark. Experience: 5-8 years.
Posted 1 month ago
5.0 - 10.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters. Big Data Developer - Spark, Scala, PySpark; coding and scripting. Years of experience: 5 to 12 years. Location: Bangalore. Notice period: 0 to 30 days. Key skills: Proficient in Spark, Scala, and PySpark coding and scripting. Fluent in big data engineering development using the Hadoop/Spark ecosystem. Hands-on experience in big data. Good knowledge of the Hadoop ecosystem. Knowledge of cloud architecture (AWS). Data ingestion and integration into the data lake using Hadoop ecosystem tools such as Sqoop, Spark, Impala, Hive, Oozie, Airflow, etc. Candidates should be fluent in the Python/Scala language. Strong communication skills. 2. Perform coding and ensure optimal software/module development: Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software. Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing these cases. Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. Analyze information to recommend and plan the installation of new systems or modifications of an existing system. Ensure that code is error-free, has no bugs, and has no test failures. Prepare reports on programming project specifications, activities and status. Ensure all the codes are raised as per the norm defined for the project/program/account with clear descriptions and replication patterns. Compile timely, comprehensive and accurate documentation and reports as requested. Coordinate with the team on daily project status and progress and document it. Provide feedback on usability and serviceability, trace the result to quality risk and report it to concerned stakeholders. 3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution: Capture all the requirements and clarifications from the client for better-quality work. Take feedback on a regular basis to ensure smooth and on-time delivery. Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements. Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code. Document all necessary details and reports in a formal way for proper understanding of software from client proposal to implementation. Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc. Respond to customer requests in a timely manner, with no instances of complaints either internally or externally. Deliverables, performance parameters and measures: 1. Continuous integration, deployment and monitoring of software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan. 2. Quality & CSAT: on-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation. 3. MIS & reporting: 100% on-time MIS and report generation. Mandatory Skills: Python for Insights. Experience: 5-8 years.
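To ground the data-ingestion-into-the-data-lake skill listed above, here is a hedged PySpark sketch (the posting equally allows Scala) that loads raw files into a partitioned Hive table on a Hadoop/Spark cluster. The HDFS path, columns, and table name are placeholders.

```python
# Hedged sketch: raw CSV files on HDFS -> partitioned Hive table in the data lake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("lake-ingest")
    .enableHiveSupport()            # needed to write managed Hive tables
    .getOrCreate()
)

raw = spark.read.option("header", "true").csv("hdfs:///landing/customers/")  # placeholder

(
    raw.withColumn("load_date", F.current_date())   # partition column for incremental loads
    .write.mode("append")
    .partitionBy("load_date")
    .format("parquet")
    .saveAsTable("lake.customers")                  # placeholder Hive table
)
```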
Posted 1 month ago
5.0 - 8.0 years
9 - 14 Lacs
Pune
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists (responsibilities and performance measures as in the Production Specialist posting above). Mandatory Skills: Scala programming. Experience: 5-8 years.
Posted 1 month ago
1.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists (responsibilities and performance measures as in the Production Specialist posting above).
Posted 1 month ago
5.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists (responsibilities and performance measures as in the Production Specialist posting above). Mandatory Skills: Scala programming. Experience: 5-8 years.
Posted 1 month ago
4.0 - 8.0 years
5 - 15 Lacs
Kolkata
Work from Office
DBX - L3 Required Skills
Posted 1 month ago
5.0 - 8.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists (responsibilities and performance measures as in the Production Specialist posting above). Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 years.
Posted 1 month ago
5.0 - 8.0 years
4 - 7 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists (responsibilities and performance measures as in the Production Specialist posting above). Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 years.
Posted 1 month ago