
3786 Hadoop Jobs - Page 18

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

8.0 years

28 - 30 Lacs

Pune

On-site


Experience: 8+ years
Budget: 30 LPA (including variable pay)
Location: Bangalore, Hyderabad, Chennai (hybrid)
Shift timing: 2 PM - 11 PM

ETL Development Lead (8+ years)
- Experience leading and mentoring a team of Talend ETL developers, providing technical direction and guidance on ETL/data integration development to the team.
- Designing complex data integration solutions using Talend and AWS.
- Collaborating with stakeholders to define project scope, timelines, and deliverables.
- Contributing to project planning, risk assessment, and mitigation strategies; ensuring adherence to project timelines and quality standards.
- Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies.
- Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components.
- Build and maintain robust, scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files).
- Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality.
- Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes.
- Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions.
- Perform unit testing and participate in system integration testing of ETL processes.
- Monitor and maintain Talend environments, including job scheduling and performance tuning.
- Document technical specifications, data flow diagrams, and ETL processes.
- Stay up to date with the latest Talend features, best practices, and industry trends.
- Participate in code reviews and contribute to the establishment of development standards.
- Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components.
- Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT).
- Strong SQL skills for data querying and manipulation.
- Experience with data profiling, data quality checks, and error handling within ETL processes.
- Familiarity with job scheduling tools and monitoring frameworks.
- Excellent problem-solving, analytical, and communication skills; ability to work independently and collaboratively within a team environment.
- Basic understanding of AWS services, e.g. EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB.
- Understanding of AWS data integration services, e.g. Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions.

Preferred qualifications:
- Experience leading and mentoring a team of 8+ Talend ETL developers.
- Experience working with US healthcare customers.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Talend certifications (e.g., Talend Certified Developer); AWS Certified Cloud Practitioner / Data Engineer Associate.
- Experience with AWS data and infrastructure services.
- A basic working understanding of Terraform and GitLab is required.
- Experience with scripting languages such as Python or shell scripting.
- Experience with agile development methodologies.
- Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform.

Job type: Full-time
Pay: ₹2,800,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work location: In person
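The data-quality and error-handling requirements in this listing (typed conversions, routing bad rows to an error channel, loading to a target) can be sketched outside Talend in a few lines of plain Python. All names, the sample CSV, and the SQLite target below are illustrative assumptions, not part of any actual Talend job the role describes:

```python
import csv
import io
import sqlite3

# Hypothetical extract-transform-load sketch with reject routing.
RAW = "id,amount\n1,100\n2,\n3,250\n"  # invented sample data

def extract(text):
    """Extract: parse CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: type-cast rows; route unparseable rows to a reject list."""
    good, rejects = [], []
    for r in rows:
        try:
            good.append((int(r["id"]), float(r["amount"])))
        except ValueError:
            rejects.append(r)  # error channel, e.g. for later review
    return good, rejects

def load(conn, rows):
    """Load: idempotent upsert into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact(id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT OR REPLACE INTO fact VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
good, rejects = transform(extract(RAW))
load(conn, good)
print(len(good), len(rejects))  # 2 1 -- row with the empty amount is rejected
```

The same extract/transform/load split and reject flow is what a Talend job expresses graphically with components and error links.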

Posted 3 days ago

Apply

3.0 years

0 Lacs

Mumbai

Remote


Experience: 3 to 4 years
Location: Mumbai, Maharashtra, India
Openings: 2

Job description:

Key responsibilities:
- Apply design and data analysis techniques to organize the presentation of data in innovative ways; collaborate with research analysts to identify the best means of visually depicting a story.
- Design and develop custom dashboard solutions, as well as reusable data visualization templates.
- Analyze data, identify trends, and discover insights that will guide strategic leadership decisions.
- In daily practice, use JavaScript, Tableau, QlikView, Qlik Sense, SAS Visual Analytics, and Power BI for dashboard design and development.

Desired qualifications:
- M.Sc. or PhD in a corresponding field.
- Hands-on experience with programming languages (e.g., Python, Java, Scala) and/or big data systems (e.g., Hadoop, Spark, Storm).
- Experience with Linux, Unix shell scripting, NoSQL, and machine learning.
- Knowledge of and experience with cloud environments like AWS, Azure, or GCP.
- Knowledge of Scrum and Agile.

Required qualifications:
- Experience designing and developing visual reports and dynamic dashboards on platforms like Tableau, Qlik, Power BI, SAS, or CRM Analytics.
- Experience with SQL, ETL, data warehousing, and BI.
- Knowledge of big data.
- Strong verbal and written communication skills in English.

Benefits:
- Competitive salary: 2625 - 4500 EUR gross
- Flexible vacation, plus health and travel insurance and relocation support
- Work from home, flexible working hours
- Work with Fortune 500 companies from different industries all over the world
- Skills development and training opportunities, company-paid certifications
- Opportunities to advance your career
- An open-minded and inclusive company culture

Role: Visualization Expert
Department: UI/UX
Education: Bachelor's degree in Computer Science, Statistics, Applied Mathematics, or another related field

Posted 3 days ago

Apply

2.0 - 5.0 years

10 Lacs

Pune

On-site


Come work at a place where innovation and teamwork come together to support the most exciting missions in the world!

Job description:
We are seeking a Data Scientist to develop next-generation Security Analytics products. You will work closely with engineers and product managers to prototype, design, develop, and optimize data-driven security solutions. As a Data Scientist, you will focus on consolidating and analysing diverse data sources to extract meaningful insights that drive product innovation and process optimization. The ideal candidate has a strong background in machine learning, especially in Natural Language Processing.

Responsibilities:
- Design, develop, and deploy machine learning models.
- Collaborate with Product Management and cross-functional stakeholders to define problem statements, develop solution strategies, and design scalable ML systems.
- Leverage Large Language Models (LLMs) to build GenAI capabilities that drive business impact.
- Create insightful data visualisations, technical reports, and presentations to communicate findings to technical and non-technical audiences.
- Deploy ML models in production and implement monitoring frameworks to ensure model performance, stability, and continuous improvement.

Requirements:
- BS, MS, or PhD in Computer Science, Statistics, or a related field.
- 2-5 years of experience in machine learning projects, including model development, deployment, and optimization.
- Deep understanding of ML algorithms, their mathematical foundations, and real-world trade-offs.
- Expertise in NLP techniques such as Named Entity Recognition, Information Retrieval, text classification, and text-to-text generation.
- Familiarity and working experience with GenAI applications; experience with prompting, fine-tuning, and optimizing LLMs; knowledge of recent advancements and trends in GenAI, demonstrating a commitment to continuous learning in this rapidly evolving field.
- Hands-on experience with ML frameworks such as scikit-learn, TensorFlow, PyTorch, LangChain, and vLLM.
- Strong programming skills in Python and/or Java.
- Experience with SQL, Pandas, and PySpark for efficient data manipulation.
- Familiarity with microservice architectures, CI/CD, and MLOps best practices.
- Strong communication, problem-solving, and analytical skills; a collaborative team player.

Nice to have:
- Familiarity with distributed computing frameworks such as Hadoop, Spark, and OpenSearch.
- Published AI/ML research in peer-reviewed journals or top conferences (e.g., NeurIPS, ICML, CVPR).
- Prior experience applying AI/ML to cybersecurity use cases.
- Basic proficiency in Unix/Linux environments for scripting and automation.
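As a toy illustration of the text-classification work this listing mentions, here is a from-scratch Naive Bayes classifier in standard-library Python. The security-flavoured labels and training sentences are invented; a real system would use scikit-learn, PyTorch, or a fine-tuned LLM as the listing suggests:

```python
import math
from collections import Counter, defaultdict

# Invented toy corpus: two classes of short security-event texts.
train = [
    ("password stolen in phishing email", "threat"),
    ("malware found on host", "threat"),
    ("quarterly report attached", "benign"),
    ("lunch meeting tomorrow", "benign"),
]

prior = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
vocab = set()
for text, label in train:
    for w in text.split():
        word_counts[label][w] += 1
        vocab.add(w)

def predict(text):
    """Score each label with log P(label) + sum log P(word|label)."""
    scores = {}
    for label in prior:
        total = sum(word_counts[label].values())
        logp = math.log(prior[label] / len(train))
        for w in text.split():
            # Laplace smoothing keeps unseen words from zeroing a class out.
            logp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = logp
    return max(scores, key=scores.get)

print(predict("phishing email detected"))  # threat
```

The log-probability sum and Laplace smoothing here are the same mechanics `sklearn.naive_bayes.MultinomialNB` applies at scale over a bag-of-words matrix.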

Posted 3 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


We deliver the world’s most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role. Building on our past. Ready for the future.

Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we’re bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The role:
As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities:
- Design, build, and maintain scalable data pipelines that can handle large volumes of data.
- Document the design of proposed solutions, including structuring data (data modelling using techniques such as 3NF and dimensional modelling) and optimising data for further consumption (working closely with data visualization engineers, front-end developers, data scientists and ML engineers).
- Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases or file stores, or retrieved from SOAP and REST interfaces).
- Develop data integration patterns for batch and streaming processes, including implementation of incremental loads.
- Build quick prototypes and proofs of concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
- Define data engineering standards and develop data ingestion/integration frameworks.
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
- Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
- Develop and maintain automated data quality pipelines.
- Collaborate with cross-functional teams to identify opportunities for process improvement.
- Manage a team of Data Engineers.

About you:
To be considered for this role, it is envisaged you will possess the following attributes:
- Bachelor’s degree in Computer Science or a related field.
- 7+ years of experience in big data technologies such as Hadoop, Spark, Hive and Delta Lake.
- 7+ years of experience in cloud computing platforms such as Azure, AWS or GCP.
- Experience working in cloud data platforms, including a deep understanding of scaled data solutions.
- Experience working with different data integration patterns (batch and streaming) and implementing incremental data loads.
- Proficient in scripting in Java, Windows and PowerShell.
- Proficient in at least one programming language, such as Python or Scala; expert in SQL.
- Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL (e.g. Cosmos DB, MongoDB), Azure Data Factory, Databricks, or similar services on AWS/GCP.
- Experience using ETL tools (such as Informatica IICS Data Integration) is an advantage.
- Strong understanding of data quality principles and experience implementing them.

Moving forward together:
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We’re building a diverse, inclusive and respectful workplace, creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here. Please note: if you are being represented by a recruitment agency, you will not be considered; to be considered, you will need to apply directly to Worley.

Company: Worley
Primary location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment type: Agency Contractor
Job level: Experienced
Job posting: Jun 16, 2025
Unposting date: Jul 16, 2025
Reporting manager title: Senior General Manager
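The "incremental loads" pattern this listing asks for is commonly implemented with a high-water mark. Below is a minimal SQLite sketch under assumed table and column names; in practice the same pattern would target the Azure services named above (ADF, Synapse, Databricks) rather than SQLite:

```python
import sqlite3

# Source system with some events; schema and data are invented.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE events(id INTEGER, ts INTEGER)")
src.executemany("INSERT INTO events VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])

# Target warehouse plus a watermark table tracking the last loaded timestamp.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE events(id INTEGER, ts INTEGER)")
tgt.execute("CREATE TABLE watermark(ts INTEGER)")
tgt.execute("INSERT INTO watermark VALUES (0)")

def incremental_load():
    """Copy only rows newer than the watermark, then advance it."""
    (wm,) = tgt.execute("SELECT ts FROM watermark").fetchone()
    rows = src.execute("SELECT id, ts FROM events WHERE ts > ?", (wm,)).fetchall()
    if rows:
        tgt.executemany("INSERT INTO events VALUES (?, ?)", rows)
        tgt.execute("UPDATE watermark SET ts = ?", (max(ts for _, ts in rows),))
    return len(rows)

print(incremental_load())  # 3 -- first run copies everything
print(incremental_load())  # 0 -- second run finds nothing new
```

The watermark makes each run idempotent with respect to already-loaded data, which is the core of the batch incremental-load pattern; streaming variants replace the polling with change feeds or message queues.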

Posted 3 days ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Overview:
The Technology Solution Delivery - Front Line Manager (M1) is responsible for providing leadership and day-to-day direction to a cross-functional engineering team. This role involves establishing and executing operational plans, managing relationships with internal and external customers, and overseeing technical fulfillment projects. The manager also supports sales verticals in customer interactions and ensures the delivery of technology solutions aligns with business needs.

What you will do:
- Build strong relationships with both internal and external stakeholders, including product, business and sales partners.
- Demonstrate excellent communication skills, with the ability to both simplify complex problems and dive deeper when needed.
- Manage teams with cross-functional skills that include software, quality and reliability engineers, project managers and scrum masters.
- Mentor, coach and develop junior and senior software, quality and reliability engineers.
- Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices.
- Ensure compliance with EFX secure software development guidelines and best practices; responsible for meeting and maintaining QE, DevSec, and FinOps KPIs.
- Define, maintain and report SLAs, SLOs and SLIs meeting EFX engineering standards, in partnership with the product, engineering and architecture teams.
- Drive technical documentation, including support and end-user documentation and runbooks.
- Lead sprint planning, sprint retrospectives, and other team activities.
- Implement architecture decision-making associated with product features/stories, refactoring work, and EOSL decisions.
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise, audience-appropriate format.
- Provide coaching, leadership and talent development; ensure teams function as high-performing teams; identify performance gaps and opportunities for upskilling and transition when necessary.
- Drive a culture of accountability through actions, stakeholder engagement and expectation management.
- Develop the long-term technical vision and roadmap within, and often beyond, the scope of your teams.
- Oversee systems designs within the scope of the broader area, and review product or system development code to solve ambiguous problems.
- Identify and resolve problems affecting day-to-day operations.
- Set priorities for the engineering team and coordinate work activities with other supervisors.
- Cloud certification strongly preferred.

What experience you need:
- BS or MS degree in a STEM major or equivalent job experience required.
- 10+ years’ experience in software development and delivery.
- You adore working in a fast-paced and agile development environment.
- You possess excellent communication, sharp analytical abilities, and proven design skills.
- You have detailed knowledge of modern software development lifecycles, including CI/CD.
- You have the ability to operate across a broad and complex business unit with multiple stakeholders.
- You have an understanding of the key aspects of finance, especially as related to technology, specifically including total cost of ownership and value.
- You are a self-starter, highly motivated, and have a real passion for actively learning and researching new methods of work and new technology.
- You possess excellent written and verbal communication skills, with the ability to communicate with team members at various levels, including business leaders.

What could set you apart:
- UI development (e.g. HTML, JavaScript, AngularJS, Angular 4/5 and Bootstrap)
- Source code control management systems (e.g. Git, Subversion) and build tools like Maven
- Big data, Postgres, Oracle, MySQL and NoSQL databases (e.g. Cassandra, Hadoop, MongoDB, Neo4j)
- Design patterns
- Agile environments (e.g. Scrum, XP)
- Software development best practices such as TDD (e.g. JUnit), automated testing (e.g. Gauge, Cucumber, FitNesse), continuous integration (e.g. Jenkins, GoCD)
- Linux command line and shell scripting languages
- Relational databases (e.g. SQL Server, MySQL)
- Cloud computing, SaaS (Software as a Service)
- Atlassian tooling (e.g. JIRA, Confluence, and Bitbucket)
- Experience working in financial services
- Experience working with open source frameworks, preferably Spring, though we would also consider Ruby, Apache Struts, Symfony, Django, etc.
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

Behaviors: customer-focused with a drive to exceed expectations; demonstrates integrity and accountability; intellectually curious and driven to innovate; values diversity and fosters collaboration; results-oriented with a sense of urgency and agility.

Posted 3 days ago

Apply


0 years

0 - 0 Lacs

Tiruchchirāppalli

On-site


A data scientist collects and analyzes large datasets to uncover insights and create solutions that support organizational goals. They combine technical, analytical, and communication skills to interpret data and influence decision-making.

Key responsibilities:
- Gather data from multiple sources and prepare it for analysis.
- Analyze large volumes of structured and unstructured data to identify trends and patterns.
- Develop machine learning models and predictive algorithms to solve business problems.
- Use statistical techniques to validate findings and ensure accuracy.
- Automate processes using AI tools and programming.
- Create clear, engaging visualizations and reports to communicate results.
- Work closely with different teams to apply data-driven insights.
- Stay updated with the latest tools, technologies, and methods in data science.

Tools and technologies:
- Programming languages: Python, R, SQL
- Data visualization: Tableau, Power BI, matplotlib
- Machine learning frameworks: TensorFlow, scikit-learn, PyTorch
- Big data platforms: Apache Hadoop, Spark
- Cloud platforms: AWS, Azure, Google Cloud
- Statistical tools: SAS, SPSS

Job type: Full-time
Pay: ₹9,938.89 - ₹30,790.14 per month
Schedule: Day shift, Monday to Friday, morning shift, weekend availability
Supplemental pay: Performance bonus
Application question(s): Are you an immediate joiner?
Location: Tiruchirappalli, Tamil Nadu (preferred)
Work location: In person
Application deadline: 19/06/2025
Expected start date: 19/06/2025
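To make the "identify trends and patterns" and "statistical techniques" duties above concrete, here is a standard-library least-squares trend fit. The data points and the week/signups framing are invented; in practice this role would reach for pandas or scikit-learn from the tool list:

```python
# Ordinary least-squares fit of y = intercept + slope * x, from scratch.
xs = [1, 2, 3, 4, 5]       # e.g. week number (invented sample)
ys = [12, 15, 21, 24, 30]  # e.g. weekly signups (invented sample)

n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
# Slope = covariance(x, y) / variance(x); intercept pins the line at the means.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(round(slope, 2), round(intercept, 2))  # 4.5 6.9
print(round(intercept + slope * 6, 1))       # 33.9 -- naive forecast for week 6
```

Validating such a fit statistically (e.g. with R², residual plots, or a t-test on the slope) is the kind of check the "ensure accuracy" responsibility refers to.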

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job purpose:
Work as a Senior Technology Consultant on FinCrime solutions modernisation and transformation projects. You should exhibit deep experience in FinCrime solutions during client discussions and be able to convince the client of the proposed solution. Lead and manage a team of technology consultants to deliver large technology programs in the capacity of project manager.

Work experience requirements:
- Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities.
- Define and validate customisation needs for AML products as per client requirements.
- Review client processes and workflows, and make recommendations to the client to maximise benefits from the AML product.
- Show in-depth knowledge of best banking practices and AML product modules.
- Prior experience in one or more COTS products such as Norkom, Actimize, NetReveal, SAS AML VI/VIA, Fircosoft or Quantexa.

Your client responsibilities:
- Work as a Technical Business Systems Analyst on one or more FinCrime projects.
- Interface and communicate with the onsite coordinators.
- Complete assigned tasks on time, with regular status reporting to the lead, the manager and onsite coordinators.
- Interface with customer representatives as and when needed.
- Be willing to travel to customer locations on a need basis.

Mandatory skills - technical:
- Application and solution (workflow, interface) technical design.
- Business requirements definition, analysis, and mapping.
- SQL, and an understanding of big data technologies such as Spark, Hadoop, or Elasticsearch.
- Scripting/programming: at least one programming or scripting language among Python, Java or Unix shell script.
- Hands-on prior experience in NetReveal modules development.
- Experience in product migration and implementation; preferably part of at least one AML implementation.
- Experience in cloud and CI/CD (DevOps automation environments).
- A high-level understanding of infrastructure designs, data models and application/business architecture.
- Act as the Subject Matter Expert (SME) and possess excellent functional/operational knowledge of the activities performed by the various teams.

Mandatory skills - functional:
- Thorough knowledge of the KYC process.
- Thorough knowledge of transaction monitoring and scenarios.
- Should have developed one or more modules covering KYC (know your customer), CDD (customer due diligence), EDD (enhanced due diligence), sanctions screening, PEP (politically exposed person) screening, adverse media screening, TM (transaction monitoring), and CM (case management).
- Thorough knowledge of case management workflows.
- Experience in requirements gathering, documentation and gap analysis of OOTB (out-of-the-box) vs custom features.
- Agile (Scrum or Kanban) methodology.
- Exposure to conducting or participating in product demonstrations, training, and assessment studies.
- Analytical thinking in finding out-of-the-box solutions, with an ability to provide a customization approach and configuration mapping.
- Excellent client-facing skills.
- Able to review test cases and guide the testing team on a need basis.
- End-to-end product implementation and transformation experience is desirable.
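As a hedged illustration of the transaction-monitoring scenarios this role works with, here is a toy Python sketch of two common rule families: a large-transaction alert and a structuring check (many sub-threshold transactions summing past a limit). The thresholds, field names, and amounts are all assumptions; in COTS platforms such as NetReveal or Actimize these scenarios are configured, not hand-coded:

```python
from collections import defaultdict

THRESHOLD = 10_000             # assumed single-transaction alert limit
STRUCTURING_WINDOW_SUM = 9_000 # assumed aggregate limit for sub-threshold txns

# Invented sample transactions for two accounts.
txns = [
    {"account": "A", "amount": 12_000},
    {"account": "B", "amount": 3_000},
    {"account": "B", "amount": 3_500},
    {"account": "B", "amount": 3_200},
]

def run_scenarios(txns):
    alerts = []
    totals = defaultdict(float)
    # Scenario 1: any single transaction at or above the threshold.
    for t in txns:
        if t["amount"] >= THRESHOLD:
            alerts.append(("LARGE_TXN", t["account"]))
        totals[t["account"]] += t["amount"]
    # Scenario 2: structuring -- only sub-threshold transactions,
    # but the aggregate exceeds the window limit.
    for acct, total in totals.items():
        if total >= STRUCTURING_WINDOW_SUM and all(
            t["amount"] < THRESHOLD for t in txns if t["account"] == acct
        ):
            alerts.append(("STRUCTURING", acct))
    return alerts

print(run_scenarios(txns))  # [('LARGE_TXN', 'A'), ('STRUCTURING', 'B')]
```

Each alert would then feed a case-management workflow of the kind described in the functional skills above.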
Education and experience (mandatory):
MBA/MCA/BE/BTech or equivalent, with banking industry experience of 3 to 8 years.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

8.0 years

0 Lacs

Noida

On-site


Are you our “TYPE”?

Monotype (Global)
Named "One of the Most Innovative Companies in Design" by Fast Company, Monotype brings brands to life through type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences. Headquartered in the Boston area of the United States and with offices across four continents, Monotype is the world’s leading company in fonts and a trusted partner to the world’s top brands.

Monotype Solutions India
Monotype Solutions India is a strategic center of excellence for Monotype and is a certified Great Place to Work® three years in a row. The focus of this fast-growing center spans Product Development, Product Management, Experience Design, User Research, Market Intelligence, Research in areas of Artificial Intelligence and Machine Learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales.
About the role
We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks and find alternative solutions to various problems. In addition, the role also demands leading, motivating and mentoring other team members with respect to technical challenges.

You will have an opportunity to:
- Work in a scrum team to design and build high-quality customer-facing software.
- Provide hands-on technical leadership and mentoring, and ensure a great user experience.
- Write unit, functional and end-to-end tests using Mocha, Chai, Sinon, KarateJS and CodeceptJS.
- Help design our architecture and set code standards for ReactJS and NodeJS development.
- Gain product knowledge by successfully developing features for our applications.
- Communicate effectively with stakeholders, peers, and others.

What we’re looking for:
- 8-10 years of experience developing complex, scalable web-based applications.
- Experienced in test-driven development, continuous integration, and continuous delivery.
- A minimum of 6 years of extensive MERN/MEVN (MongoDB, ExpressJS, ReactJS/VueJS and NodeJS) stack hands-on development experience: NodeJS primarily, with ReactJS/VueJS/ExpressJS exposure.
- Experience in Electron, C++ and/or Objective-C.
- Good problem-solving and analytical skills.
- Hands-on experience designing and defining database schemas using RDBMS and NoSQL databases.
- Experience working in an Agile development environment.
- Experience with web services, REST APIs and microservices.
Experience with Amazon AWS services & real-time data analytics technology (Hadoop, Spark, Kinesis, etc.) Experience with Git, Bitbucket, or GitHub and the feature-branching workflow. Excellent written and oral communication skills, with the ability to work in a global, distributed environment and the agility to tailor communication to different audiences. Knowledge of or experience with GitHub Copilot. Monotype is an Equal Opportunities Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


You Lead the Way. We’ve Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers’ digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex. How will you make an impact in this role? Build NextGen data strategy, data virtualization, and data lake/warehousing solutions. Transform and improve performance of existing reporting & analytics use cases with more efficient, state-of-the-art data engineering solutions. 
Analytics Development to realize the advanced analytics vision and strategy in a scalable, iterative manner. Deliver software that provides superior user experiences, linking customer needs and business drivers together through innovative product engineering. Cultivate an environment of engineering excellence and continuous improvement, leading changes that drive efficiencies into existing engineering and delivery processes. Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management and cost effectiveness. Work with key stakeholders to drive software solutions that align to strategic roadmaps, prioritized initiatives and strategic technology directions. Work with peers, staff engineers and staff architects to assimilate new technology and delivery methods into scalable software solutions. Minimum Qualifications: Bachelor’s degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred. 5+ years of hands-on experience implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI & data strategy & BI tools. Proven experience in Business Intelligence, reporting on large datasets, data virtualization tools, Big Data, GCP, Java, and microservices. Strong systems integration architecture skills and a high degree of technical expertise, ranging across a number of technologies, with a proven track record of turning new technologies into business solutions. Should be proficient in at least one programming language (Python/Java). Should have a good understanding of data structures. GCP/cloud knowledge is an added advantage. Good knowledge and understanding of Power BI, Tableau, and Looker. Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-communication. 
Experience managing in a fast-paced, complex, and dynamic global environment. Preferred Qualifications: Bachelor’s degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred. 5+ years of hands-on experience implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI & data strategy & BI tools. Proven experience in Business Intelligence, reporting on large datasets, Oracle Business Intelligence (OBIEE), Tableau, MicroStrategy, data virtualization tools, Oracle PL/SQL, Informatica, other ETL tools like Talend, and Java. Should be proficient in at least one programming language (Python/Java). Should be good at data structures and reasoning. GCP or other cloud knowledge is an added advantage. Good knowledge and understanding of Power BI, Tableau, and Looker. Strong systems integration architecture skills and a high degree of technical expertise, ranging across several technologies, with a proven track record of turning new technologies into business solutions. Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-communication. We back you with benefits that support your holistic well-being so you can be and deliver your best. 
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Senior Technology Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit deep experience in FinCrime solutions during client discussions and be able to convince the client about the solution. Lead and manage a team of technology consultants to deliver large technology programs in the capacity of project manager. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as Norkom, Actimize, NetReveal, SAS AML VI/VIA, Fircosoft, or Quantexa. Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time and regular status reporting to the lead. Regular status reporting to the Manager and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to the customers’ locations on a need basis. 
Mandatory skills: Technical: Application and solution (workflow, interface) technical design. Business requirements definition, analysis, and mapping. SQL and an understanding of big data technologies such as Spark, Hadoop, or Elasticsearch. Scripting/Programming: At least one programming/scripting language among Python, Java, or Unix Shell Script. Hands-on prior experience in NetReveal module development. Experience in product migration and implementation - preferably having been part of at least one AML implementation. Experience in Cloud and CI/CD (DevOps automation environment). Should possess a high-level understanding of infrastructure designs, data models, and application/business architecture. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Functional: Thorough knowledge of the KYC process. Thorough knowledge of transaction monitoring and scenarios. Should have developed one or more modules covering KYC - know your customer, CDD - customer due diligence, EDD - enhanced due diligence, sanction screening, PEP - politically exposed person, adverse media screening, TM - transaction monitoring, CM - case management. Thorough knowledge of case management workflows. Experience in requirements gathering, documentation, and gap analysis of OOTB (out of the box) vs custom features. Agile (Scrum or Kanban) methodology. Exposure to conducting or participating in product demonstrations, training, and assessment studies. Analytical thinking in finding out-of-the-box solutions, with an ability to provide a customization approach and configuration mapping. Excellent client-facing skills. Should be able to review test cases and guide the testing team on a need basis. End-to-end product implementation and transformation experience is desirable. 
Education And Experience – Mandatory MBA/MCA/BE/BTech or equivalent, with banking industry experience of 3 to 8 years. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
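The threshold-style transaction-monitoring scenarios this role works with (single large transactions, and aggregation of just-under-limit amounts) can be sketched in plain Python. This is a minimal, hypothetical illustration of the rule pattern only, not how NetReveal, Actimize, or any other COTS product implements scenarios; all names (`Txn`, `run_scenarios`, the limits) are invented for the example.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Txn:
    account: str
    amount: float
    day: int  # simplified day index in place of a timestamp


def run_scenarios(txns, single_limit=10_000, daily_limit=25_000):
    """Flag accounts that breach a single-transaction limit, or whose
    transactions aggregate past a daily limit (a structuring-style rule)."""
    alerts = []
    daily_totals = defaultdict(float)
    for t in txns:
        if t.amount >= single_limit:
            alerts.append((t.account, t.day, "LARGE_TXN"))
        daily_totals[(t.account, t.day)] += t.amount
    for (account, day), total in daily_totals.items():
        if total >= daily_limit:
            alerts.append((account, day, "AGGREGATE"))
    return alerts


alerts = run_scenarios([
    Txn("ACC-1", 12_000, 1),          # breaches the single-transaction limit
    Txn("ACC-2", 9_000, 1),           # individually under the limit...
    Txn("ACC-2", 9_000, 1),
    Txn("ACC-2", 9_000, 1),           # ...but 27,000 in one day aggregates
])
```

Real scenario engines add tunable parameters, suppression logic, and case-management routing on top of rules of this shape.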

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Senior Technology Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit deep experience in FinCrime solutions during client discussions and be able to convince the client about the solution. Lead and manage a team of technology consultants to deliver large technology programs in the capacity of project manager. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as Norkom, Actimize, NetReveal, SAS AML VI/VIA, Fircosoft, or Quantexa. Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time and regular status reporting to the lead. Regular status reporting to the Manager and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to the customers’ locations on a need basis. 
Mandatory skills: Technical: Application and solution (workflow, interface) technical design. Business requirements definition, analysis, and mapping. SQL and an understanding of big data technologies such as Spark, Hadoop, or Elasticsearch. Scripting/Programming: At least one programming/scripting language among Python, Java, or Unix Shell Script. Hands-on prior experience in NetReveal module development. Experience in product migration and implementation - preferably having been part of at least one AML implementation. Experience in Cloud and CI/CD (DevOps automation environment). Should possess a high-level understanding of infrastructure designs, data models, and application/business architecture. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Functional: Thorough knowledge of the KYC process. Thorough knowledge of transaction monitoring and scenarios. Should have developed one or more modules covering KYC - know your customer, CDD - customer due diligence, EDD - enhanced due diligence, sanction screening, PEP - politically exposed person, adverse media screening, TM - transaction monitoring, CM - case management. Thorough knowledge of case management workflows. Experience in requirements gathering, documentation, and gap analysis of OOTB (out of the box) vs custom features. Agile (Scrum or Kanban) methodology. Exposure to conducting or participating in product demonstrations, training, and assessment studies. Analytical thinking in finding out-of-the-box solutions, with an ability to provide a customization approach and configuration mapping. Excellent client-facing skills. Should be able to review test cases and guide the testing team on a need basis. End-to-end product implementation and transformation experience is desirable. 
Education And Experience – Mandatory MBA/MCA/BE/BTech or equivalent, with banking industry experience of 3 to 8 years. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

7.0 years

0 Lacs

Delhi, India

On-site


Role Expectations: Data Collection and Cleaning: Collect, organize, and clean large datasets from various sources (internal databases, external APIs, spreadsheets, etc.). Ensure data accuracy, completeness, and consistency by cleaning and transforming raw data into usable formats. Data Analysis: Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies. Conduct statistical analysis to support decision-making and uncover insights. Use analytical methods to identify opportunities for process improvements, cost reductions, and efficiency enhancements. Reporting and Visualization: Create and maintain clear, actionable, and accurate reports and dashboards for both technical and non-technical stakeholders. Design data visualizations (charts, graphs, and tables) that communicate findings effectively to decision-makers. Hands-on with Power BI, Tableau, and Python data visualization libraries such as Matplotlib, Seaborn, Plotly, and pandas. Experience generating descriptive, predictive, and prescriptive insights with Gen AI using MS Copilot in Power BI. Experience in prompt engineering & RAG architectures. Prepare reports for upper management and other departments, presenting key findings and recommendations. Collaboration: Work closely with cross-functional teams (marketing, finance, operations, etc.) to understand their data needs and provide actionable insights. Collaborate with IT and database administrators to ensure data is accessible and well-structured. Provide support and guidance to other teams regarding data-related questions or issues. Data Integrity and Security: Ensure compliance with data privacy and security policies and practices. Maintain data integrity and assist with implementing best practices for data storage and access. Continuous Improvement: Stay current with emerging data analysis techniques, tools, and industry trends. 
Recommend improvements to data collection, processing, and analysis procedures to enhance operational efficiency. Qualifications: Education: Bachelor's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field. A Master's degree or relevant certifications (e.g., in data analysis, business intelligence) is a plus. Experience: Proven experience as a Data Analyst or in a similar analytical role (typically 7+ years). Experience with data visualization tools (e.g., Tableau, Power BI, Looker). Strong knowledge of SQL and experience with relational databases. Familiarity with data manipulation and analysis tools (e.g., Python, R, Excel, SPSS). Hands-on with Power BI, Tableau, and Python visualization libraries such as Matplotlib, Seaborn, Plotly, and pandas. Experience with big data technologies (e.g., Hadoop, Spark) is a plus. Technical Skills: Proficiency in SQL and data query languages. Knowledge of statistical analysis and methodologies. Experience with data visualization and reporting tools. Knowledge of data cleaning and transformation techniques. Familiarity with machine learning and AI concepts is an advantage (for more advanced roles). Soft Skills: Strong analytical and problem-solving abilities. Excellent attention to detail and ability to identify trends in complex data sets. Good communication skills to present data insights clearly to both technical and non-technical audiences. Ability to work independently and as part of a team. Strong time management and organizational skills, with the ability to prioritize tasks effectively.
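The cleaning-then-summarizing step described above (drop incomplete records, coerce types, compute descriptive statistics) can be sketched with the Python standard library alone; in practice this would usually be pandas, but the shape is the same. The field name `revenue` and the helper name are hypothetical.

```python
import statistics


def clean_and_summarize(raw_rows):
    """Drop incomplete or malformed records, coerce the numeric field,
    and compute simple descriptive statistics -- a typical EDA first pass."""
    values = []
    for row in raw_rows:
        v = row.get("revenue")
        if v in (None, "", "N/A"):
            continue  # drop incomplete records rather than guessing a value
        try:
            values.append(float(v))
        except ValueError:
            continue  # skip malformed entries like "abc"
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
    }


summary = clean_and_summarize([
    {"revenue": "10"},
    {"revenue": ""},      # incomplete -> dropped
    {"revenue": "20"},
    {"revenue": "N/A"},   # sentinel -> dropped
    {"revenue": "abc"},   # malformed -> dropped
    {"revenue": "30"},
])
```

The same decisions (drop vs. impute, which sentinels count as missing) are exactly what a data analyst documents when defining a cleaning procedure.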

Posted 3 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Responsibilities Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies for various use cases built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing. Preferred Education Master's Degree Required Technical And Professional Expertise Total 6-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and data engineering skills. Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala; minimum 3 years of experience on cloud data platforms on Azure; experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB. Good to excellent SQL skills. Preferred Technical And Professional Experience Certification in Azure and Databricks, or Cloudera Certified Spark developers.
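The ingest/process/transform pipeline shape described above, which in this role would be expressed as a Spark/PySpark job against Hive or HBase, can be sketched with plain Python generators. This is an illustrative sketch of the extract-transform-load pattern only, under the assumption of a tiny CSV-like input; all function and field names are hypothetical.

```python
def extract(lines):
    """Parse raw "id,amount" records into dictionaries (the ingest step)."""
    for line in lines:
        ident, amount = line.split(",")
        yield {"id": ident, "amount": float(amount)}


def transform(records, tax_rate=0.18):
    """Enrich each record with a derived tax field (the transform step)."""
    for r in records:
        r["tax"] = round(r["amount"] * tax_rate, 2)
        yield r


def load(records, sink):
    """Write transformed records to a target (here just a list)."""
    for r in records:
        sink.append(r)
    return sink


# Compose the three stages, lazily streaming records through the pipeline.
sink = load(transform(extract(["a1,100.0", "a2,50.0"])), [])
```

In a Spark job the generators become DataFrame transformations and the lazy composition becomes Spark's lazy execution plan, but the stage boundaries are the same.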

Posted 3 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In This Role, Your Responsibilities May Include Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Preferred Education Master's Degree Required Technical And Professional Expertise Develop/convert the database (Hadoop to GCP) and its specific objects (tables, views, procedures, functions, triggers, etc.)
from one database platform to another. Implementation of a specific data replication mechanism (CDC, file data transfer, bulk data transfer, etc.). Expose data as APIs. Participation in the modernization roadmap journey. Analyze discovery and analysis outcomes. Lead discovery and analysis workshops/playbacks. Identification of application dependencies and source/target database incompatibilities. Analyze the non-functional requirements (security, HA, RTO/RPO, storage, compute, network, performance bench, etc.). Prepare effort estimates, WBS, staffing plan, RACI, RAID, etc. Lead the team to adopt the right tools for various migration and modernization methods. Preferred Technical And Professional Experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
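The CDC-style data replication mentioned above can be illustrated with a watermark-based incremental copy: only rows changed since the last synced version are moved. This is a minimal sketch using SQLite in place of a real source/target pair (e.g., Hadoop and GCP); the table and column names are hypothetical, and production CDC is usually log-based rather than driven by a version column.

```python
import sqlite3


def replicate_incremental(src, dst, last_version):
    """Copy only rows whose version exceeds the watermark, then
    return the new watermark (a simple pull-based CDC sketch)."""
    rows = src.execute(
        "SELECT id, payload, version FROM events WHERE version > ?",
        (last_version,),
    ).fetchall()
    dst.executemany(
        "INSERT OR REPLACE INTO events (id, payload, version) VALUES (?, ?, ?)",
        rows,
    )
    dst.commit()
    # Advance the watermark only as far as what we actually copied.
    return max((r[2] for r in rows), default=last_version)


# Demo: two in-memory databases standing in for source and target platforms.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for conn in (src, dst):
    conn.execute(
        "CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT, version INTEGER)"
    )
src.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "a", 1), (2, "b", 2), (3, "c", 3)],
)
watermark = replicate_incremental(src, dst, 0)  # first sync copies everything
```

Subsequent calls with the returned watermark copy only the delta, which is what keeps repeated syncs cheap as the source grows.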

Posted 3 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer at IBM, you'll play a vital role in the design and development of applications, providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Preferred Education Master's Degree Required Technical And Professional Expertise Experience with Apache Spark (PySpark): in-depth knowledge of Spark’s architecture, core APIs, and PySpark for distributed data processing. Big Data Technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data Engineering Skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data Processing Frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL Proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud Platforms: experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems. Preferred Technical And Professional Experience Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. 
Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: experience with detection and prevention tools for company products and platforms, and for customer-facing systems.

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Senior Technology Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit deep experience in FinCrime solutions during client discussions and be able to convince the client about the solution. Lead and manage a team of technology consultants to deliver large technology programs in the capacity of project manager. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as Norkom, Actimize, NetReveal, SAS AML VI/VIA, Fircosoft, or Quantexa. Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time and regular status reporting to the lead. Regular status reporting to the Manager and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to the customers’ locations on a need basis. 
Mandatory skills: Technical: Application and solution (workflow, interface) technical design. Business requirements definition, analysis, and mapping. SQL and an understanding of big data technologies such as Spark, Hadoop, or Elasticsearch. Scripting/Programming: At least one programming/scripting language among Python, Java, or Unix Shell Script. Hands-on prior experience in NetReveal module development. Experience in product migration and implementation - preferably having been part of at least one AML implementation. Experience in Cloud and CI/CD (DevOps automation environment). Should possess a high-level understanding of infrastructure designs, data models, and application/business architecture. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Functional: Thorough knowledge of the KYC process. Thorough knowledge of transaction monitoring and scenarios. Should have developed one or more modules covering KYC - know your customer, CDD - customer due diligence, EDD - enhanced due diligence, sanction screening, PEP - politically exposed person, adverse media screening, TM - transaction monitoring, CM - case management. Thorough knowledge of case management workflows. Experience in requirements gathering, documentation, and gap analysis of OOTB (out of the box) vs custom features. Agile (Scrum or Kanban) methodology. Exposure to conducting or participating in product demonstrations, training, and assessment studies. Analytical thinking in finding out-of-the-box solutions, with an ability to provide a customization approach and configuration mapping. Excellent client-facing skills. Should be able to review test cases and guide the testing team on a need basis. End-to-end product implementation and transformation experience is desirable. 
Education And Experience – Mandatory MBA/MCA/BE/BTech or equivalent, with banking industry experience of 3 to 8 years. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Where Data Does More. Join the Snowflake team. Snowflake Support is committed to providing high-quality resolutions to help deliver data-driven business insights and results. We are a team of subject matter experts collectively working toward our customers’ success. We form partnerships with customers by listening, learning, and building connections. Snowflake’s Support team is expanding! We are looking for a Senior Cloud Support Engineer who likes working with data and solving a wide variety of issues utilizing their technical experience having worked on a variety of operating systems, database technologies, big data, data integration, connectors, and networking. As a Senior Cloud Support Engineer , your role is to delight our customers with your passion and knowledge of Snowflake Data Warehouse. Customers will look to you for technical guidance and expert advice with regard to their effective and optimal use of Snowflake. You will be the voice of the customer regarding product feedback and improvements for Snowflake’s product and engineering teams. You will play an integral role in building knowledge within the team and be part of strategic initiatives for organizational and process improvements. Based on business needs, you may be assigned to work with one or more Snowflake Priority Support customers. You will develop a strong understanding of the customer’s use case and how they leverage the Snowflake platform. You will deliver exceptional service, enabling them to achieve the highest levels of continuity and performance from their Snowflake implementation. Ideally, you have worked in a 24x7 environment, handled technical case escalations and incident management, worked in technical support for an RDBMS, been on-call during weekends, and are familiar with database release management. 
Ability to work the 1st/morning shift, which typically starts from 6 am IST Applicants should be flexible with schedule changes to meet business needs AS A SENIOR CLOUD SUPPORT ENGINEER AT SNOWFLAKE, YOU WILL: Drive technical solutions to complex problems, providing in-depth analysis and guidance to Snowflake customers and partners using the following methods of communication: email, web, and phone Adhere to response and resolution SLAs and escalation processes in order to ensure fast resolution of customer issues that exceed expectations Demonstrate good problem-solving skills and be process-oriented Utilize the Snowflake environment, connectors, 3rd party partner software, and tools to investigate issues Document known solutions to the internal and external knowledge base Report well-documented bugs and feature requests arising from customer-submitted requests Partner with engineering teams in prioritizing and resolving customer requests Participate in a variety of Support initiatives Provide support coverage during holidays and weekends based on business needs OUR IDEAL SENIOR CLOUD SUPPORT ENGINEER WILL HAVE: Bachelor’s or Master’s degree in Computer Science or equivalent discipline 5+ years of experience in a Technical Support environment or a similar technical function in a customer-facing role Excellent writing and communication skills in English with attention to detail Ability to work in a highly collaborative environment across global teams In-depth knowledge of various caching mechanisms and ability to take advantage of caching strategies to enhance performance. Extensive experience with at least one major cloud service provider Advanced understanding of cloud services, architecture, and best practices. Proficient in database patch and release management. Ability to interpret systems performance metrics (CPU, I/O, RAM, Network stats) Experience with designing and implementing high availability and disaster recovery plans. 
NICE TO HAVE: Knowledge of distributed computing principles and frameworks (e.g., Hadoop, Spark). Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com

Posted 3 days ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities We are seeking a highly skilled Advanced Analytics Specialist to join our dynamic team. The successful candidate will be responsible for leveraging advanced analytics techniques to derive actionable insights, inform business decisions, and drive strategic initiatives. This role requires a deep understanding of data analysis, statistical modeling, machine learning, and data visualization. In this role, you will be responsible for architecting and delivering AI solutions using cutting-edge technologies, with a strong focus on foundation models and large language models. You will work closely with customers, product managers, and development teams to understand business requirements and design custom AI solutions that address complex challenges. Experience with tools like GitHub Copilot and Amazon CodeWhisperer is desirable. Success is our passion, and your accomplishments will reflect this, driving your career forward, propelling your team to success, and helping our clients to thrive. Day-to-Day Duties Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations.
Help showcase the ability of Gen AI code assistants to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/PoCs. Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best practice guides. Contribute to internal knowledge sharing initiatives and mentor new team members. Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Preferred Education Master's Degree Required Technical And Professional Expertise Develop and implement advanced analytical models and algorithms to solve complex business problems. Analyze large datasets to uncover trends, patterns, and insights that drive business performance. Collaborate with cross-functional teams to identify key business challenges and opportunities. Create and maintain data pipelines and workflows to ensure the accuracy and integrity of data. Design and deliver insightful reports and dashboards to communicate findings to stakeholders. Stay up to date with the latest advancements in analytics, machine learning, and data science. Provide technical expertise and mentorship to junior team members. Qualifications: Bachelor’s or Master’s degree in Data Science, Statistics, Mathematics, Computer Science, or a related field. Proven experience in advanced analytics, data science, or a similar role. Proficiency in programming languages such as Python, R, or SQL. Experience with data visualization tools like Tableau, Power BI, or similar.
Strong understanding of statistical modeling and machine learning algorithms. Excellent analytical, problem-solving, and critical thinking skills. Ability to communicate complex analytical concepts to non-technical stakeholders. Experience with big data technologies (e.g., Hadoop, Spark) is a plus. Preferred Technical And Professional Experience Familiarity with cloud-based analytics platforms (e.g., AWS, Azure). Knowledge of natural language processing (NLP) and deep learning techniques. Experience with project management and agile methodologies.
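The "statistical modeling" expectation above is often probed in interviews with a from-scratch exercise. A minimal sketch of ordinary least squares, assuming nothing beyond the Python standard library (real work would reach for scikit-learn or statsmodels):

```python
# Minimal ordinary-least-squares fit in plain Python (stdlib only).
# Illustrative sketch only; the sample data below is invented.

def ols_fit(xs, ys):
    """Return (slope, intercept) minimising squared error for y ~ slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is covariance(x, y) / variance(x); intercept follows from the means.
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return slope, intercept

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
slope, intercept = ols_fit(xs, ys)  # slope is close to 2, intercept close to 0
```

Being able to derive these two formulas on a whiteboard, rather than only calling `LinearRegression().fit()`, is what "strong understanding of statistical modeling" usually means in practice.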

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Senior Technology Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit deep experience in FinCrime solutions during client discussions and be able to convince the client about the solution. Lead and manage a team of technology consultants to deliver large technology programs in the capacity of project manager. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as Norkom, Actimize, NetReveal, SAS AML VI/VIA, Fircosoft or Quantexa Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time, with regular status reporting to the lead, Manager, and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to customer locations on a need basis.
Mandatory skills: Technical: Application and solution (workflow, interface) technical design Business requirements definition, analysis, and mapping SQL and understanding of Big Data technologies such as Spark, Hadoop, or Elasticsearch Scripting/Programming: At least one programming/scripting language among Python, Java or Unix Shell Script Hands-on prior experience in NetReveal module development Experience in product migration and implementation - preferably been part of at least one AML implementation. Experience in Cloud and CI/CD (DevOps automation environment) Should possess a high-level understanding of infrastructure designs, data models and application/business architecture. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Functional: Thorough knowledge of the KYC process Thorough knowledge of transaction monitoring and scenarios Should have developed one or more modules covering KYC - know your customer, CDD - customer due diligence, EDD - enhanced due diligence, sanction screening, PEP - politically exposed person, adverse media screening, TM - transaction monitoring, CM - case management. Thorough knowledge of case management workflows Experience in requirements gathering, documentation and gap analysis of OOTB (out of the box) vs custom features. Agile (Scrum or Kanban) methodology Exposure to conducting or participating in product demonstrations, training, and assessment studies. Analytical thinking in finding out-of-the-box solutions, with an ability to provide a customization approach and configuration mapping. Excellent client-facing skills Should be able to review the test cases and guide the testing team on a need basis. End-to-end product implementation and transformation experience is desirable.
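The SQL plus transaction-monitoring knowledge asked for above can be illustrated with a toy aggregation scenario: flag customers whose same-day cash deposits exceed a threshold. This is a hypothetical sketch using `sqlite3`; vendor platforms such as NetReveal or Actimize express scenarios through their own configuration layers rather than raw SQL, and the data and threshold here are invented.

```python
# Toy transaction-monitoring scenario: aggregate same-day cash deposits
# per customer and alert above a (hypothetical) threshold.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE txn (customer_id TEXT, txn_date TEXT, txn_type TEXT, amount REAL);
INSERT INTO txn VALUES
  ('C001', '2024-05-01', 'CASH_DEPOSIT', 9500),
  ('C001', '2024-05-01', 'CASH_DEPOSIT', 4000),
  ('C002', '2024-05-01', 'CASH_DEPOSIT', 3000),
  ('C002', '2024-05-02', 'WIRE_OUT',     12000);
""")

THRESHOLD = 10000  # hypothetical daily cash aggregation limit
alerts = conn.execute("""
    SELECT customer_id, txn_date, SUM(amount) AS total
    FROM txn
    WHERE txn_type = 'CASH_DEPOSIT'
    GROUP BY customer_id, txn_date
    HAVING SUM(amount) > ?
""", (THRESHOLD,)).fetchall()

print(alerts)  # [('C001', '2024-05-01', 13500.0)] -- one alert row
```

Real scenarios layer on lookback windows, customer risk ratings, and structuring detection, but interviewers often start from exactly this kind of GROUP BY/HAVING aggregation.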
Education And Experience – Mandatory MBA/ MCA/ BE/ BTech or equivalent with banking industry experience of 3 to 8 years EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Responsibilities Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka. Preferred Education Master's Degree Required Technical And Professional Expertise Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala; minimum 3 years of experience on Cloud Data Platforms on Azure; experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse, SQL Server DB. Good to excellent SQL skills. Exposure to streaming solutions and message brokers like Kafka. Preferred Technical And Professional Experience Certification in Azure and Databricks, or Cloudera Spark Certified Developer.

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Senior Technology Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit deep experience in FinCrime solutions during client discussions and be able to convince the client about the solution. Lead and manage a team of technology consultants to deliver large technology programs in the capacity of project manager. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as Norkom, Actimize, NetReveal, SAS AML VI/VIA, Fircosoft or Quantexa Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time, with regular status reporting to the lead, Manager, and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to customer locations on a need basis.
Mandatory skills: Technical: Application and solution (workflow, interface) technical design Business requirements definition, analysis, and mapping SQL and understanding of Big Data technologies such as Spark, Hadoop, or Elasticsearch Scripting/Programming: At least one programming/scripting language among Python, Java or Unix Shell Script Hands-on prior experience in NetReveal module development Experience in product migration and implementation - preferably been part of at least one AML implementation. Experience in Cloud and CI/CD (DevOps automation environment) Should possess a high-level understanding of infrastructure designs, data models and application/business architecture. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Functional: Thorough knowledge of the KYC process Thorough knowledge of transaction monitoring and scenarios Should have developed one or more modules covering KYC - know your customer, CDD - customer due diligence, EDD - enhanced due diligence, sanction screening, PEP - politically exposed person, adverse media screening, TM - transaction monitoring, CM - case management. Thorough knowledge of case management workflows Experience in requirements gathering, documentation and gap analysis of OOTB (out of the box) vs custom features. Agile (Scrum or Kanban) methodology Exposure to conducting or participating in product demonstrations, training, and assessment studies. Analytical thinking in finding out-of-the-box solutions, with an ability to provide a customization approach and configuration mapping. Excellent client-facing skills Should be able to review the test cases and guide the testing team on a need basis. End-to-end product implementation and transformation experience is desirable.
Education And Experience – Mandatory MBA/ MCA/ BE/ BTech or equivalent with banking industry experience of 3 to 8 years EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

14.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Requirements Description and Requirements Position Summary: A highly skilled Big Data (Hadoop) Administrator responsible for the installation, configuration, engineering, and architecture of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, scripting, and infrastructure-as-code for automating and optimizing operations is highly desirable. Experience in collaborating with cross-functional teams, including application development, infrastructure, and operations, is highly preferred. Job Responsibilities: Manages the design, distribution, performance, replication, security, availability, and access requirements for large and complex Big Data clusters. Designs and develops the architecture and configurations to support various application needs; implements backup, recovery, archiving, conversion strategies, and performance tuning; manages job scheduling, application release, cluster changes, and compliance. Identifies and resolves issues utilizing structured tools and techniques. Provides technical assistance and mentoring to staff in all aspects of Hadoop cluster management; consults and advises application development teams on security, query optimization, and performance. Writes scripts to automate routine cluster management tasks and documents maintenance processing flows per standards. Implement industry best practices while performing Hadoop cluster administration tasks. Works in an Agile model with a strong understanding of Agile concepts. Collaborates with development teams to provide and implement new features. Debugs production issues by analyzing logs directly and using tools like Splunk and Elastic. Address organizational obstacles to enhance processes and workflows. Adopts and learns new technologies based on demand and supports team members by coaching and assisting. 
Education: Bachelor’s degree in computer science, Information Systems, or another related field with 14+ years of IT and Infrastructure engineering work experience. Experience: 14+ years total IT experience & 10+ years relevant experience in Big Data database Technical Skills: Big Data Platform Management: Expertise in managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL. Data Infrastructure & Security: Proficient in designing and implementing robust data infrastructure solutions with a strong focus on data security, utilizing tools like Apache Ranger and Kerberos. Performance Tuning & Optimization: Skilled in performance tuning and optimization of big data environments, leveraging advanced techniques to enhance system efficiency and reduce latency. Backup & Recovery: Experienced in developing and executing comprehensive backup and recovery strategies to safeguard critical data and ensure business continuity. Linux & Troubleshooting: Strong knowledge of Linux operating systems, with proven ability to troubleshoot and resolve complex technical issues, collaborating effectively with cross-functional teams. DevOps & Scripting: Proficient in scripting and automation using tools like Ansible, enabling seamless integration and automation of cluster operations. Experienced in infrastructure-as-code practices and observability tools such as Elastic. Agile & Collaboration: Strong understanding of Agile SAFe for Teams, with the ability to work effectively in Agile environments and collaborate with cross-functional teams. ITSM Process & Tools: Knowledgeable in ITSM processes and tools such as ServiceNow.
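The "writes scripts to automate routine cluster management tasks" responsibility above might look like the following sketch, which parses `hdfs dfsadmin -report` output and flags high DFS usage. The sample report text and the 75% threshold are hypothetical; a real script would capture live output, e.g. with `subprocess.run(["hdfs", "dfsadmin", "-report"], capture_output=True, text=True)`.

```python
# Sketch of a routine Hadoop admin automation task: parse the
# "DFS Used%" line from `hdfs dfsadmin -report` and warn on high usage.
import re

# Trimmed, invented sample of dfsadmin report output for illustration.
SAMPLE_REPORT = """\
Configured Capacity: 1000000000000 (1 TB)
Present Capacity: 950000000000 (950 GB)
DFS Remaining: 200000000000 (200 GB)
DFS Used: 750000000000 (750 GB)
DFS Used%: 78.95%
"""

def dfs_used_percent(report: str) -> float:
    """Extract the DFS Used% value from a dfsadmin report."""
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if not match:
        raise ValueError("DFS Used% not found in report")
    return float(match.group(1))

used = dfs_used_percent(SAMPLE_REPORT)
if used > 75.0:  # hypothetical alert threshold
    print(f"WARNING: DFS usage at {used}%")
```

In production such a check would typically run from cron or an Ansible playbook and feed an alerting pipeline (e.g. Elastic), matching the observability tooling named in the posting.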
Other Critical Requirements: Automation and Scripting : Proficiency in automation tools and programming languages such as Ansible and Python to streamline operations and improve efficiency. Analytical and Problem-Solving Skills : Strong analytical and problem-solving abilities to address complex technical challenges in a dynamic enterprise environment. 24x7 Support : Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability. Team Management and Leadership : Proven experience managing geographically distributed and culturally diverse teams, with strong leadership, coaching, and mentoring skills. Communication Skills : Exceptional written and oral communication skills, with the ability to clearly articulate technical and functional issues, conclusions, and recommendations to stakeholders at all levels. Stakeholder Management : Prior experience in effectively managing both onshore and offshore stakeholders, ensuring alignment and collaboration across teams. Business Presentations : Skilled in creating and delivering impactful business presentations to communicate key insights and recommendations. Collaboration and Independence : Demonstrated ability to work independently as well as collaboratively within a team environment, ensuring successful project delivery in a complex enterprise setting. About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife , through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. 
United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Senior Technology Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit deep experience in FinCrime solutions during client discussions and be able to convince the client about the solution. Lead and manage a team of technology consultants to deliver large technology programs in the capacity of project manager. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as Norkom, Actimize, NetReveal, SAS AML VI/VIA, Fircosoft or Quantexa Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time, with regular status reporting to the lead, Manager, and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to customer locations on a need basis.
Mandatory skills: Technical: Application and solution (workflow, interface) technical design Business requirements definition, analysis, and mapping SQL and understanding of Big Data technologies such as Spark, Hadoop, or Elasticsearch Scripting/Programming: At least one programming/scripting language among Python, Java or Unix Shell Script Hands-on prior experience in NetReveal module development Experience in product migration and implementation - preferably been part of at least one AML implementation. Experience in Cloud and CI/CD (DevOps automation environment) Should possess a high-level understanding of infrastructure designs, data models and application/business architecture. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Functional: Thorough knowledge of the KYC process Thorough knowledge of transaction monitoring and scenarios Should have developed one or more modules covering KYC - know your customer, CDD - customer due diligence, EDD - enhanced due diligence, sanction screening, PEP - politically exposed person, adverse media screening, TM - transaction monitoring, CM - case management. Thorough knowledge of case management workflows Experience in requirements gathering, documentation and gap analysis of OOTB (out of the box) vs custom features. Agile (Scrum or Kanban) methodology Exposure to conducting or participating in product demonstrations, training, and assessment studies. Analytical thinking in finding out-of-the-box solutions, with an ability to provide a customization approach and configuration mapping. Excellent client-facing skills Should be able to review the test cases and guide the testing team on a need basis. End-to-end product implementation and transformation experience is desirable.
Education And Experience – Mandatory MBA/ MCA/ BE/ BTech or equivalent with banking industry experience of 3 to 8 years EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Description AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest growing small- and mid-market accounts to enterprise-level customers including public sector. Excited by using massive amounts of data to develop Machine Learning (ML) and Deep Learning (DL) models? Want to help the largest global enterprises derive business value through the adoption of Artificial Intelligence (AI)? Eager to learn from many different enterprises' use cases of AWS ML and DL? Thrilled to be a key part of Amazon, who has been investing in Machine Learning for decades, pioneering and shaping the world’s AI technology? At AWS ProServe India LLP (“ProServe India”), we are helping large enterprises build ML and DL models on the AWS Cloud. We are applying predictive technology to large volumes of data and against a wide spectrum of problems. Our Professional Services organization works together with our internal customers to address business needs of AWS customers using AI. AWS Professional Services is a unique consulting team in ProServe India. We pride ourselves on being customer obsessed and highly focused on the AI enablement of our customers. If you have experience with AI, including building ML or DL models, we’d like to have you join our team. You will get to work with an innovative company, with great teammates, and have a lot of fun helping our customers. If you do not live in a market where we have an open Data Scientist position, please feel free to apply. Our Data Scientists can live in any location where we have a Professional Services office. Key job responsibilities A successful candidate will be a person who enjoys diving deep into data, doing analysis, discovering root causes, and designing long-term solutions. It will be a person who likes to have fun, loves to learn, and wants to innovate in the world of AI.
Major responsibilities include: Understand the internal customer’s business need and guide them to a solution using our AWS AI Services, AWS AI Platforms, AWS AI Frameworks, and AWS AI EC2 Instances. Assist internal customers by being able to deliver a ML / DL project from beginning to end, including understanding the business need, aggregating data, exploring data, building & validating predictive models, and deploying completed models to deliver business impact to the organization. Use Deep Learning frameworks like MXNet, Caffe2, TensorFlow, Theano, CNTK, and Keras to help our internal customers build DL models. Use SparkML and Amazon Machine Learning (AML) to help our internal customers build ML models. Work with our Professional Services Big Data consultants to analyze, extract, normalize, and label relevant data. Work with our Professional Services DevOps consultants to help our internal customers operationalize models after they are built. Assist internal customers with identifying model drift and retraining models. Research and implement novel ML and DL approaches, including using FPGA. This role is open for Mumbai/Pune/Bangalore/Chennai/Hyderabad/Delhi. About The Team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture Here at AWS, it’s in our nature to learn and be curious.
Our employee-led affinity groups foster a culture of inclusion that empower us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness. Mentorship & Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve in the cloud. Basic Qualifications 7+ years of professional or military experience, including a Bachelor's degree. 7+ years managing complex, large-scale projects with internal or external customers. Assist internal customers by being able to deliver a ML / DL project from beginning to end, including understanding the business need, aggregating data, exploring data, building & validating predictive models, and deploying completed models to deliver business impact to the organization. Skilled in using Deep Learning frameworks (MXNet, Caffe2, TensorFlow, Theano, CNTK, Keras) and ML tools (SparkML, Amazon Machine Learning) to build models for internal customers. Preferred Qualifications 7+ years of IT platform implementation in a technical and analytical role experience. Experience in consulting, design and implementation of serverless distributed solutions. Experienced in databases (SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) and managing complex, large-scale customer-facing projects. 
Experienced as a technical specialist in design and architecture, with expertise in cloud-based solutions (AWS or equivalent), systems, networks, and operating systems. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - AWS ProServe IN - Karnataka Job ID: A3009199 Show more Show less
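One step of the end-to-end workflow this role describes, validating a predictive model on held-out data, can be sketched in plain Python. This is a conceptual illustration only, not AWS or framework API code; the data, the "churn"/"stay" labels, and the majority-class baseline are all made up for the example.

```python
import random

def train_test_split(rows, test_frac=0.25, seed=42):
    """Shuffle rows and split into train/test sets (toy illustration)."""
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_frac))
    return shuffled[:cut], shuffled[cut:]

def majority_baseline(train_labels):
    """Predict the most common training label for every test example."""
    return max(set(train_labels), key=train_labels.count)

# Toy labeled data: (feature, label) pairs.
data = [(i, "churn" if i % 3 == 0 else "stay") for i in range(60)]
train, test = train_test_split(data)

prediction = majority_baseline([label for _, label in train])
accuracy = sum(label == prediction for _, label in test) / len(test)
```

A real engagement would replace the baseline with a model built in one of the frameworks named above, but the validate-on-held-out-data discipline is the same.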


Exploring Hadoop Jobs in India

The demand for Hadoop professionals in India has been on the rise in recent years, with many companies leveraging big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, it is important to understand the job market, salary expectations, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving IT industry and have a high demand for Hadoop professionals.

Average Salary Range

The average salary range for Hadoop professionals in India varies based on experience levels. Entry-level Hadoop developers can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.

Career Path

In the Hadoop field, a typical career path may include roles such as Junior Developer, Senior Developer, Tech Lead, and eventually progressing to roles like Data Architect or Big Data Engineer.

Related Skills

In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.
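Of the related tools above, Hive is commonly probed on partitioning: large tables are split by a column (often a date) so queries scan only the matching partition. A rough stdlib-only Python sketch of that pruning idea follows; the `events` table, `dt` column, and paths are illustrative, not real Hive API code.

```python
from collections import defaultdict

# A "table" partitioned by date, roughly how Hive lays it out on HDFS
# (e.g. /warehouse/events/dt=2024-01-01/...). Names are made up.
partitions = defaultdict(list)
for dt, user in [("2024-01-01", "a"), ("2024-01-01", "b"),
                 ("2024-01-02", "c"), ("2024-01-03", "d")]:
    partitions[dt].append(user)

def query_events(wanted_dt):
    """Like SELECT * FROM events WHERE dt = wanted_dt: only the matching
    partition is read; every other partition is skipped entirely."""
    return partitions.get(wanted_dt, [])

rows = query_events("2024-01-02")
```

The payoff is that query cost scales with the size of one partition rather than the whole table, which is why partition design features so often in Hadoop interviews.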

Interview Questions

  • What is Hadoop and how does it work? (basic)
  • Explain the difference between HDFS and MapReduce. (medium)
  • How do you handle data skew in Hadoop? (medium)
  • What is YARN in Hadoop? (basic)
  • Describe the concept of NameNode and DataNode in HDFS. (medium)
  • What are the different types of join operations in Hive? (medium)
  • Explain the role of the ResourceManager in YARN. (medium)
  • What is the significance of the shuffle phase in MapReduce? (medium)
  • How does speculative execution work in Hadoop? (advanced)
  • What is the purpose of the Secondary NameNode in HDFS? (medium)
  • How do you optimize a MapReduce job in Hadoop? (medium)
  • Explain the concept of data locality in Hadoop. (basic)
  • What are the differences between Hadoop 1 and Hadoop 2? (medium)
  • How do you troubleshoot performance issues in a Hadoop cluster? (advanced)
  • Describe the advantages of using HBase over traditional RDBMS. (medium)
  • What is the role of the JobTracker in Hadoop? (medium)
  • How do you handle unstructured data in Hadoop? (medium)
  • Explain the concept of partitioning in Hive. (medium)
  • What is Apache ZooKeeper and how is it used in Hadoop? (advanced)
  • Describe the process of data serialization and deserialization in Hadoop. (medium)
  • How do you secure a Hadoop cluster? (advanced)
  • What is the CAP theorem and how does it relate to distributed systems like Hadoop? (advanced)
  • How do you monitor the health of a Hadoop cluster? (medium)
  • Explain the differences between Hadoop and traditional relational databases. (medium)
  • How do you handle data ingestion in Hadoop? (medium)
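
Several of the questions above (MapReduce basics, the shuffle phase) reduce to the map → shuffle → reduce pattern. A minimal pure-Python word-count sketch of that pattern follows; it is a conceptual model, not actual Hadoop API code.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle/sort: bring all values for the same key together,
    # as Hadoop does between the map and reduce phases.
    for key, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield key, [v for _, v in group]

def reduce_phase(grouped):
    # Reducer: sum the counts for each word.
    return {key: sum(values) for key, values in grouped}

lines = ["hadoop stores data", "hadoop processes data"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
# → {'data': 2, 'hadoop': 2, 'processes': 1, 'stores': 1}
```

In real Hadoop the shuffle happens across the network between mapper and reducer nodes, which is why it is usually the most expensive phase of a job.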

Closing Remark

As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!
