
8340 Hadoop Jobs - Page 19

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 4.0 years

4 - 6 Lacs

Noida, Gurugram

Work from Office

What You'll Do: Develop advanced, efficient, and statistically effective algorithms that solve problems of high dimensionality. Utilize technical skills such as hypothesis testing, machine learning and retrieval processes to apply statistical and data mining techniques to identify trends, create figures, and analyze other relevant information. Collaborate with clients and other stakeholders at ZS to integrate and effectively communicate analysis findings. Contribute to the assessment of emerging datasets and technologies that impact our analytical platform.

What You'll Bring: A master's degree in Computer Science, Statistics, or a relevant field; a strong academic record with coursework emphasizing analysis and quantitative skills. Knowledge of big data, advanced analytical concepts, and algorithms (e.g., text mining, social listening, recommender systems, predictive modeling, etc.). Proficiency in at least one programming language (e.g., Java/Python/R). Experience with tools/platforms such as the Hadoop ecosystem, Amazon Web Services or database systems. Fluency in English.
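The hypothesis-testing work this listing mentions can be illustrated with a minimal sketch; the samples below are invented for the example and the statistic is computed with the standard library alone, not from any ZS codebase:

```python
# A minimal two-sample (Welch's) t statistic using only the standard
# library; the control/treated samples are invented for the example.
from statistics import mean, variance

def welch_t(a, b):
    # Welch's t: difference of means over the combined standard error,
    # without assuming equal variances in the two samples.
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

control = [10.1, 9.8, 10.3, 10.0, 9.9]
treated = [11.2, 10.9, 11.5, 11.1, 11.0]
t_stat = welch_t(treated, control)
```

In practice a library such as SciPy would also supply the p-value; the point here is only the shape of the calculation.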

Posted 1 week ago

Apply

3.0 - 6.0 years

13 - 18 Lacs

Pune

Work from Office

ZS's Platform Development team designs, implements, tests and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products & analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact and contribute to bringing about better health outcomes.

What you'll do: Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest-quality multi-tenant software that can be productized. As part of our full-stack product engineering team, build multi-tenant cloud-based software products/platforms and internal assets that leverage cutting-edge technologies on the Amazon AWS cloud platform. Work with junior developers to implement large features that are on the cutting edge of Big Data. Be a technical leader to your team, and help them improve their technical skills. Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design. Work with product managers and architects to design product architecture and to work on POCs. Take immediate responsibility for project deliverables. Understand client business issues and design features that meet client needs. Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills.

What you'll bring: Bachelor's degree in CS, IT, or a related discipline. Strong analytic, problem-solving, and programming ability. Experience coding in an object-oriented language such as Python, Java or C#. Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies. Experience with development on the AWS (Amazon Web Services) platform is preferable. Experience in Linux shell or PowerShell scripting is preferable. Experience in HTML5, JavaScript, and JavaScript libraries is preferable. Understanding of data science algorithms. Good to have: Pharma domain understanding. Initiative and drive to contribute. Excellent organizational and task management skills. Strong communication skills. Ability to work in global cross-office teams. ZS is a global firm; fluency in English is required.
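The Spark/Hadoop experience this listing asks for builds on the map/shuffle/reduce pattern; a toy word count in plain Python (purely illustrative, not tied to any ZS system) shows the shape of it:

```python
# Toy MapReduce-style word count: the map/shuffle/reduce pattern that
# Hadoop and Spark distribute across a cluster, here run in-process.
from collections import Counter
from itertools import chain

def map_phase(chunk):
    # Map: emit (word, 1) pairs for one partition of the input.
    return [(word.lower(), 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Shuffle + reduce: sum the counts per key.
    totals = Counter()
    for word, n in pairs:
        totals[word] += n
    return dict(totals)

partitions = ["big data big", "data pipelines"]
mapped = chain.from_iterable(map_phase(p) for p in partitions)
counts = reduce_phase(mapped)  # {'big': 2, 'data': 2, 'pipelines': 1}
```

Frameworks like Spark add the partitioning, fault tolerance and cluster scheduling around exactly this kind of per-partition map and keyed reduce.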

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office

ZS's Insights & Analytics group partners with clients to design and deliver solutions to help them tackle a broad range of business challenges. Our teams work on multiple projects simultaneously, leveraging advanced data analytics and problem-solving techniques. Our recommendations and solutions are based on rigorous research and analysis underpinned by deep expertise and thought leadership.

What You'll Do: Develop advanced, efficient, and statistically effective algorithms that solve problems of high dimensionality. Utilize technical skills such as hypothesis testing, machine learning and retrieval processes to apply statistical and data mining techniques to identify trends, create figures, and analyze other relevant information. Collaborate with clients and other stakeholders at ZS to integrate and effectively communicate analysis findings. Contribute to the assessment of emerging datasets and technologies that impact our analytical platform.

What You'll Bring: A master's degree in Computer Science, Statistics, or a relevant field; a strong academic record with coursework emphasizing analysis and quantitative skills. Knowledge of big data, advanced analytical concepts, and algorithms (e.g., text mining, social listening, recommender systems, predictive modeling, etc.). Proficiency in at least one programming language (e.g., Java/Python/R). Experience with tools/platforms such as the Hadoop ecosystem, Amazon Web Services or database systems. Fluency in English.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Pune

Work from Office

ZS's Platform Development team designs, implements, tests and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products & analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact and contribute to bringing about better health outcomes.

What you'll do: As part of our full-stack product engineering team, build multi-tenant cloud-based software products/platforms and internal assets that leverage cutting-edge technologies on the Amazon AWS cloud platform. Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest-quality multi-tenant software that can be productized. Work with junior developers to implement large features that are on the cutting edge of Big Data. Be a technical leader to your team, and help them improve their technical skills. Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design. Work with product managers and architects to design product architecture and to work on POCs. Take immediate responsibility for project deliverables. Understand client business issues and design features that meet client needs. Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills.

What you'll bring: 1-3 years of experience in developing software, ideally building SaaS products and services. Bachelor's degree in CS, IT, or a related discipline. Strong analytic, problem-solving, and programming ability. Strong hands-on experience with AWS services (EC2, EMR, S3, serverless stack, RDS, SageMaker, IAM, EKS, etc.). Experience coding in an object-oriented language such as Python, Java or C#. Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies. Experience with development on the AWS (Amazon Web Services) platform is preferable. Experience in Linux shell or PowerShell scripting is preferable. Experience in HTML5, JavaScript, and JavaScript libraries is preferable. Good to have: Pharma domain understanding. Initiative and drive to contribute. Excellent organizational and task management skills. Strong communication skills. Ability to work in global cross-office teams. ZS is a global firm; fluency in English is required.

Posted 1 week ago

Apply

1.0 - 6.0 years

8 - 13 Lacs

Pune

Work from Office

Azure Data Engineer
Pune, India | Enterprise IT - 22756

What you'll do: Create and maintain optimal data pipeline architecture. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability. Design, develop and deploy high-volume ETL pipelines to manage complex and near-real-time data collection. Develop and optimize SQL queries and stored procedures to meet business requirements. Design, implement, and maintain REST APIs for data interaction between systems. Ensure performance, security, and availability of databases. Handle common database procedures such as upgrade, backup, recovery, migration, etc. Collaborate with other team members and stakeholders. Prepare documentation and specifications.

What you'll bring: Bachelor's degree in Computer Science, Information Technology, or a related field. 1+ years of experience with SQL, T-SQL, and Azure Data Factory, Synapse or a relevant ETL technology. Strong analytical skills (impact/risk analysis, root cause analysis, etc.). Proven ability to work in a team environment, creating partnerships across multiple levels. Demonstrated drive for results, with appropriate attention to detail and commitment. Hands-on experience with Azure SQL Database.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.
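The extract-transform-load work this role describes can be sketched in miniature; the table, columns and records below are invented for the example (not from any real ZS or Azure system), and Azure Data Factory/Synapse orchestrate the same pattern at far larger scale:

```python
# Minimal extract-transform-load step using stdlib sqlite3 as the
# "load" target; all names and data here are illustrative.
import sqlite3

raw = [("2025-07-01", " 120 "), ("2025-07-02", "bad"), ("2025-07-03", "95")]

def transform(rows):
    # Keep only rows whose amount parses as an integer; a real pipeline
    # would route rejects to a dead-letter store instead of dropping them.
    clean = []
    for day, amount in rows:
        try:
            clean.append((day, int(amount.strip())))
        except ValueError:
            continue
    return clean

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", transform(raw))
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]  # 215
```

The validation step sitting between extract and load is where most of the "data quality and accuracy" effort in such roles goes.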

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Gurugram, Chennai, Bengaluru

Work from Office

What You'll Do: Develop and apply advanced statistical models that help clients understand dynamic business issues. Leverage analytic techniques to use data to guide client and ZS team decision-making. Design custom analyses in R, Tableau, SAS, Visual Basic and Excel to investigate and inform client needs. Synthesize and communicate results to clients and ZS teams through oral and written presentations. Develop client relationships and serve as a key point of contact on aspects of projects. Provide clients and ZS teams with project status updates. Create project deliverables and implement solutions. Advance problem-solving skills and improve ZS's capabilities. Guide and mentor Associates on teams.

What You'll Bring: Bachelor's or master's degree in any discipline with a strong record of academic success in quantitative and analytic coursework such as operations research, applied mathematics, management science, data science, statistics, econometrics or engineering. Up to 3 years of relevant post-collegiate job experience. Fluency in English. Knowledge of programming (e.g., Java/Python/R). Exposure to tools/platforms (e.g., the Hadoop ecosystem and database systems). Demonstrated proficiency in a programming language or analytic tool such as R, SAS, Tableau, or VBA. High motivation, good work ethic, maturity, and personal initiative. Effective oral and written communication skills.

Posted 1 week ago

Apply

8.0 years

3 - 9 Lacs

Hyderābād

On-site

Senior Data Scientist
Hyderabad, Telangana, India
Date posted: Jul 08, 2025
Job number: 1844017
Work site: Microsoft on-site only
Travel: 0-25%
Role type: Individual Contributor
Profession: Research, Applied, & Data Sciences
Discipline: Data Science
Employment type: Full-Time

Overview: Security represents the most critical priorities for our customers in a world awash in digital threats, regulatory scrutiny, and estate complexity. Microsoft Security aspires to make the world a safer place for all. We want to reshape security and empower every user, customer, and developer with a security cloud that protects them with end to end, simplified solutions. The Microsoft Security organization accelerates Microsoft’s mission and bold ambitions to ensure that our company and industry is securing digital technology platforms, devices, and clouds in our customers’ heterogeneous environments, as well as ensuring the security of our own internal estate. Our culture is centered on embracing a growth mindset, a theme of inspiring excellence, and encouraging teams and leaders to bring their best each day. In doing so, we create life-changing innovations that impact billions of lives around the world. We are looking for a Senior Data Scientist to join our Detection Engineering team and lead the development of AI/ML models that enhance the efficiency and impact of Microsoft’s Security Operations Center (SOC). In this role, you will drive the design and optimization of scalable, data-driven solutions that transform massive security signal data into actionable intelligence. You’ll bring deep technical expertise to guide model architecture, platform integration, and best practices in security engineering, while collaborating closely with analysts, responders, and engineers to align on goals, scope, and strategy. Beyond technical leadership, you’ll continuously evolve our advanced detection frameworks to improve accuracy, reduce false positives, and stay ahead of emerging threats.
You’ll mentor early-in-career engineers, foster a culture of learning and innovation, and contribute to a strong, inclusive team environment grounded in Microsoft’s values. If you’re passionate about applying data science to real-world security challenges and thrive in a fast-paced, mission-driven space, we’d love to hear from you.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. In alignment with our Microsoft values, we are committed to cultivating an inclusive work environment for all employees to positively impact our culture every day.

Qualifications: 8+ years of experience in Data Science, machine learning, natural language processing, and deep learning, preferably with a focus on Cyber Security or related fields. Experience in programming languages such as Python, R, or Scala, with hands-on experience in data analysis, experimental design principles and visualization. Experience in translating complex data into actionable insights and recommendations that drive business impact. Excellent technical design skills and proven ability to drive large-scale system designs for complex projects or products. Expertise in machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, scikit-learn and others). In-depth knowledge of cybersecurity principles, threats, and attack vectors. Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and data processing. Strong analytical and problem-solving skills with the ability to think creatively. Excellent communication skills with the ability to explain complex concepts to stakeholders.
Master's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or a related field AND 5+ years data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results), OR Bachelor's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or a related field AND 8+ years data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results), or equivalent experience.

Preferred Qualifications: Experience in developing and deploying machine learning models for cybersecurity applications. Experience in Big Data, preferably in the cybersecurity or SaaS industry. Experience with data science workloads on the Azure tech stack: Synapse, Azure ML, etc. Knowledge of anomaly detection, fraud detection, and other related areas. Familiarity with security fundamentals and attack vectors. Publications or contributions to the field of data science or cybersecurity. Excellent track record of cross-team collaboration. Ambitious, self-motivated. Agile, can-do attitude and great at dealing with ambiguity.

Responsibilities: Develop and implement machine learning models and algorithms to detect security threats and attacks within Microsoft. Analyse large and complex datasets to identify patterns and anomalies indicative of security risks. Collaborate with security experts to understand threat landscapes and incorporate domain knowledge into models. Continuously monitor and improve the performance of security models to adapt to evolving threats. Lead the design and implementation of data-driven security solutions and tools. Mentor and guide junior data scientists in best practices and advanced techniques. Communicate findings and insights to stakeholders, including senior leadership and technical teams.
Stay up to date with the latest advancements in data science, machine learning, and cybersecurity.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry-leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect.

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
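The anomaly detection named in the preferred qualifications can be illustrated with a deliberately tiny sketch; the data and threshold below are invented for the example, and real SOC detections use far richer features and models:

```python
# Toy anomaly detection on security-signal volumes, stdlib only: flag
# points far from the median, scaled by the median absolute deviation
# (MAD), which, unlike mean/stdev, is not inflated by the outlier itself.
from statistics import median

def anomalies(counts, threshold=5.0):
    med = median(counts)
    mad = median(abs(c - med) for c in counts)
    return [i for i, c in enumerate(counts)
            if abs(c - med) > threshold * mad]

hourly_signals = [52, 48, 50, 51, 49, 300, 50, 47]
flagged = anomalies(hourly_signals)  # index 5 is the 300-event spike
```

Using the median rather than the mean is the point of the sketch: a single large spike would inflate a mean/stdev baseline enough to hide itself.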

Posted 1 week ago

Apply

3.0 years

0 Lacs

Madurai, Tamil Nadu, India

Remote

Job Description: The Data Analyst is responsible for developing data management policies and procedures. This is a key global role, working in collaboration with all functions. He/she conveys specific, observable, and/or measurable expectations for each assignment, and verifies understanding and agreement on deliverables and timeframes. Staying up to date with technologies and new solutions helps the Data Analyst remain tech-savvy. Work closely with business stakeholders and translate their needs to Data & Analytics peers: Data Architect, Data Lake, Business Analytics.

Key Job Areas of Responsibility: Develop records management processes and policies. Identify areas to increase efficiency and automation of processes; set up and maintain automated data processes. Identify, evaluate and implement external services and tools to support data validation and cleansing. Cross-team collaboration. Data modelling. Data mining. Convey specific, observable, and/or measurable expectations for each assignment, and verify understanding and agreement on deliverables and timeframes. Consistently make timely decisions even in the face of complexity, balancing systematic analysis with decisiveness. Technology upgrade oversight. Pattern analysis. Machine learning solutions. Produce and track key performance indicators. Develop and support reporting processes. Monitor and audit data quality and accuracy. Liaise with internal and external clients to fully understand data content; gather, understand and document detailed business requirements using appropriate tools and techniques.

Education / Qualifications: Bachelor's degree in Computer Science, Engineering, Applied Mathematics, Economics or a related field, and a minimum of 3 years of related work experience.

Experience Required: Experience with modelling and development. Experience bringing prototypes to production on Hadoop or NoSQL platforms. Experience with visualization software (Tableau, Power BI) and business analysis management tools (Power Designer, etc.).

Key Skills and Knowledge: Fluent in English. Excellent numerical and analytical skills. Understanding of best-in-class model and data configuration and development processes. Excellent collaboration and negotiation skills. Experience working with remote and global teams. Knowledge of data modelling, data cleansing, and data enrichment techniques. Knowledge of data analysis tools. The capacity to develop and document procedures and workflows. The ability to carry out data quality control, validation and linkage, and an understanding of data protection issues. The ability to produce clear graphical representations and data visualizations.

About Us: Garrett is a cutting-edge technology leader delivering differentiated solutions for emission reduction and energy efficiency. We are passionate about innovating for mobility and beyond. With a nearly 70-year legacy, we serve customers worldwide with passenger vehicle, commercial vehicle, aftermarket replacement, and performance enhancement solutions.

About The Team: The Garrett Information Technology (IT) team focuses on understanding the business, market challenges and new technologies to deliver competitive and innovative services that make our business more flexible both today and in the future.

Posted 1 week ago

Apply

0 years

2 Lacs

Gurgaon

On-site

Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Senior Data Scientist

AI Garage is responsible for establishing Mastercard as an AI powerhouse. AI will be leveraged and implemented at scale within Mastercard, providing a foundational competitive advantage for the future. All internal processes, products and services will be enabled by AI, continuously advancing our value proposition, consumer experience, and efficiency.

Opportunity: Join Mastercard's AI Garage @ Gurgaon, a newly created strategic business unit executing on identified use cases for product optimization and operational efficiency, securing Mastercard's competitive advantage through all things AI. The AI professional will be responsible for the creative application and execution of AI use cases, working collaboratively with other AI professionals and business stakeholders to effectively drive the AI mandate.
Role: Ensure all AI solution development is in line with industry standards for data management and privacy compliance, including the collection, use, storage, access, retention, output, reporting, and quality of data at Mastercard. Adopt a pragmatic approach to AI, capable of articulating complex technical requirements in a manner that is simple and relevant to stakeholder use cases. Gather relevant information to define the business problem, interfacing with global stakeholders. Creative thinker capable of linking AI methodologies to identified business challenges. Identify commonalities amongst use cases, enabling a microservice approach to scaling AI at Mastercard and building reusable, multi-purpose models. Develop AI/ML solutions/applications leveraging the latest industry and academic advancements. Leverage open and closed source technologies to solve business problems. Work cross-functionally and across borders, drawing on a broader team of colleagues to effectively execute the AI agenda. Partner with technical teams to implement developed solutions/applications in a production environment. Support a learning culture continuously advancing AI capabilities.

All About You

Experience: Experience in the Data Sciences field with a focus on AI strategy and execution and developing solutions from scratch. Demonstrated passion for AI, e.g., competing in sponsored challenges such as Kaggle. Previous experience with or exposure to:
o Deep learning algorithm techniques, open source tools and technologies, statistical tools, and programming environments such as Python, R, and SQL
o Big Data platforms such as Hadoop, Hive, Spark, and GPU clusters for deep learning
o Classical machine learning algorithms like Logistic Regression, Decision Trees, Clustering (K-means, Hierarchical and Self-Organizing Maps), t-SNE, PCA, Bayesian models, Time Series (ARIMA/ARMA), Recommender Systems (Collaborative Filtering, FPMC, FISM, Fossil)
o Random Forest, GBM, KNN, SVM, Bayesian and text mining techniques; Multilayer Perceptrons and neural networks (feedforward, CNN, LSTMs, GRUs) are a plus; optimization techniques (activity regularization (L1 and L2), Adam, Adagrad, Adadelta); cost functions in neural nets (contrastive loss, hinge loss, binary cross entropy, categorical cross entropy); developed applications in KRR, NLP, speech and image processing
o Deep learning frameworks for production systems like TensorFlow, Keras (for RPD and neural net architecture evaluation) and PyTorch; XGBoost, Caffe, and Theano are a plus
Exposure or experience using collaboration tools such as:
o Confluence (documentation)
o Bitbucket/Stash (code sharing)
o Shared folders (file sharing)
o ALM (project management)
Knowledge of the payments industry is a plus. Experience with the SAFe (Scaled Agile Framework) process is a plus.

Effectiveness: Effective at managing and validating assumptions with key stakeholders in compressed timeframes, without hampering development momentum. Capable of navigating a complex organization in a relentless pursuit of answers and clarity. Enthusiasm for Data Sciences, embracing the creative application of AI techniques to improve an organization's effectiveness. Ability to understand technical system architecture and overarching function along with interdependency elements, as well as anticipate challenges for immediate remediation. Ability to unpack complex problems into addressable segments and evaluate the AI methods most applicable to addressing each segment. Incredible attention to detail and focus, instilling confidence without qualification in developed solutions.

Core Capabilities: Strong written and oral communication skills. Strong project management skills. Concentration in Computer Science. Some international travel required.

Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization, and it is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard’s security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
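Among the classical machine learning algorithms this listing names is K-means clustering; a bare-bones, deterministic sketch in plain Python (points and initial centroids invented for the example) shows the assign/update loop at its core:

```python
# Bare-bones k-means: repeatedly assign each point to its nearest
# centroid, then move each centroid to the mean of its cluster.
# Fixed initial centroids keep this toy run deterministic.

def kmeans(points, centroids, iters=10):
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            # Assign p to the centroid with the smallest squared distance.
            j = min(range(len(centroids)),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2
                                  + (p[1] - centroids[i][1]) ** 2)
            clusters[j].append(p)
        # Update: mean of each cluster (keep old centroid if cluster empty).
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

pts = [(1.0, 1.0), (1.2, 0.8), (0.8, 1.1), (8.0, 8.0), (8.3, 7.9), (7.8, 8.2)]
centers = kmeans(pts, centroids=[(0.0, 0.0), (10.0, 10.0)])
```

Production work would of course reach for scikit-learn or Spark MLlib, which add smarter initialization (k-means++) and convergence checks around the same two-step loop.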

Posted 1 week ago

Apply

0 years

2 Lacs

Gurgaon

On-site

Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Manager Data Scientist

AI Garage is responsible for establishing Mastercard as an AI powerhouse. AI will be leveraged and implemented at scale within Mastercard, providing a foundational competitive advantage for the future. All internal processes, products and services will be enabled by AI, continuously advancing our value proposition, consumer experience, and efficiency.

Opportunity: Join Mastercard's AI Garage @ Gurgaon, a newly created strategic business unit executing on identified use cases for product optimization and operational efficiency, securing Mastercard's competitive advantage through all things AI. The AI professional will be responsible for the creative application and execution of AI use cases, working collaboratively with other AI professionals and business stakeholders to effectively drive the AI mandate.
Role
Ensure all AI solution development is in line with industry standards for data management and privacy compliance, including the collection, use, storage, access, retention, output, reporting, and quality of data at Mastercard
Adopt a pragmatic approach to AI, capable of articulating complex technical requirements in a manner that is simple and relevant to stakeholder use cases
Gather relevant information to define the business problem, interfacing with global stakeholders
Creative thinker capable of linking AI methodologies to identified business challenges
Identify commonalities amongst use cases, enabling a microservice approach to scaling AI at Mastercard by building reusable, multi-purpose models
Develop AI/ML solutions/applications leveraging the latest industry and academic advancements
Leverage open and closed source technologies to solve business problems
Ability to work cross-functionally and across borders, drawing on a broader team of colleagues to effectively execute the AI agenda
Partner with technical teams to implement developed solutions/applications in production environments
Support a learning culture continuously advancing AI capabilities
All About You
Experience
Experience in the Data Sciences field with a focus on AI strategy and execution, and on developing solutions from scratch
Demonstrated passion for AI, e.g., competing in sponsored challenges such as Kaggle
Previous experience with or exposure to:
o Deep Learning techniques, open source tools and technologies, statistical tools, and programming environments such as Python, R, and SQL
o Big Data platforms such as Hadoop, Hive, Spark, and GPU clusters for deep learning
o Classical Machine Learning algorithms like Logistic Regression, Decision Trees, Clustering (K-means, Hierarchical, and Self-Organizing Maps), t-SNE, PCA, Bayesian models, Time Series (ARIMA/ARMA), and Recommender Systems (Collaborative Filtering, FPMC, FISM, Fossil)
o Further Machine Learning and Deep Learning techniques like Random Forest, GBM, KNN, SVM, Bayesian methods, text mining techniques, Multilayer Perceptrons, and Neural Networks (Feedforward, CNN, LSTMs, GRUs) is a plus. Optimization techniques: activity regularization (L1 and L2), Adam, Adagrad, Adadelta concepts; cost functions in neural nets: contrastive loss, hinge loss, binary cross-entropy, categorical cross-entropy; developed applications in KRR, NLP, speech, and image processing
o Deep Learning frameworks for production systems like TensorFlow, Keras (for RPD and neural net architecture evaluation), PyTorch and XGBoost, Caffe, and Theano is a plus
Exposure or experience using collaboration tools such as:
o Confluence (documentation)
o Bitbucket/Stash (code sharing)
o Shared folders (file sharing)
o ALM (project management)
Knowledge of the payments industry is a plus
Experience with the SAFe (Scaled Agile Framework) process is a plus
Effectiveness
Effective at managing and validating assumptions with key stakeholders in compressed timeframes, without hampering development momentum
Capable of navigating a complex organization in a relentless pursuit of answers and clarity
Enthusiasm for Data Sciences, embracing the creative application of AI techniques to improve an organization's effectiveness
Ability to understand technical system architecture and overarching function along with interdependency elements, as well as anticipate challenges for immediate remediation
Ability to unpack complex problems into addressable segments and evaluate the AI methods most applicable to addressing each segment
Incredible attention to detail and focus, instilling confidence without qualification in developed solutions
Core Capabilities
Strong written and oral communication skills
Strong project management skills
Concentration in Computer Science
Some international travel required
#AI1
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
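For illustration, one of the classical techniques named above, K-means clustering, can be sketched in a few lines of plain Python. The data points and cluster count below are invented for the sketch; real work would rely on libraries such as scikit-learn.

```python
def kmeans(points, k, iters=20):
    """Minimal K-means: assign each point to the nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    # Deterministic init: pick evenly spaced points as starting centroids
    step = max(len(points) // k, 1)
    centroids = [points[i * step] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated 2-D blobs (invented data)
points = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 5.0), (5.1, 5.2)]
centroids, clusters = kmeans(points, k=2)
```

With two well-separated blobs, the algorithm converges in a single pass to one centroid per blob.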

Posted 1 week ago

Apply

10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.
The opportunity
We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.
Your Key Responsibilities
Have proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. [10-15 years]
Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices
Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions
Define and develop client-specific best practices around data management within a Hadoop or cloud environment
Recommend design alternatives for data ingestion, processing and provisioning layers
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies
Skills And Attributes For Success
Experience architecting highly scalable solutions on Azure, AWS and GCP
Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components
Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
Hands-on experience with major components like cloud ETLs, Spark, and Databricks
Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks
Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms
Good knowledge of Apache Kafka and Apache Flume
Experience in enterprise-grade solution implementations
Experience in performance benchmarking enterprise applications
Experience in data security [in transit, at rest]
Strong UNIX operating system concepts and shell scripting knowledge
To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent communication skills (written and verbal, formal and informal)
Ability to multi-task under pressure and work independently with minimal supervision
Must be a team player who enjoys working in a cooperative and collaborative team environment
Adaptable to new technologies and standards
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
Responsibility for the evaluation of technical risks and mapping out mitigation strategies
Working knowledge of at least one cloud platform: AWS, Azure or GCP
Excellent business communication, consulting, and quality process skills
Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains
Minimum 7 years hands-on experience in one or more of the above areas
Minimum 10 years industry experience
Ideally, you’ll also have
Strong project management skills
Client management skills
Solutioning skills
What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment
What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development.
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
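The real-time ingestion responsibilities above (Kafka, Spark Streaming) boil down to windowed aggregation over an event feed. Below is a minimal plain-Python sketch of a tumbling-window count with invented events; a production pipeline would use Spark Structured Streaming or a Kafka consumer instead.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed, non-overlapping time windows
    and count occurrences per key in each window - the core idea behind
    micro-batch aggregation in Spark Streaming."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Invented click-stream events: (epoch seconds, event type)
events = [(0, "click"), (3, "view"), (7, "click"), (12, "click"), (14, "view")]
result = tumbling_window_counts(events, window_secs=10)
# windows: [0,10) -> {"click": 2, "view": 1}, [10,20) -> {"click": 1, "view": 1}
```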

Posted 1 week ago

Apply

0 years

6 - 9 Lacs

Gurgaon

On-site

Responsibilities:
This function covers incumbents responsible for various data activities, including data analysis, maintenance, data quality, and continuous interaction with business users to understand requirements and convert them into the needed code. Understanding of marketing data/the Retail line of business is a plus.
Day-to-day work focuses on creating SAS code to audit campaign data, execute campaigns, identify deviations, and analyze their correctness. BAU also includes reports created and provided to business users for the Retail line of business using SAS and Excel, with a planned migration to Tableau or an equivalent approved reporting tool. Knowledge of Autosys and ServiceNow is an add-on.
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Technology Stack:
Previous experience with SAS (intermediate to expert) for creating reports and complex data sets
Excel, Tableau or an equivalent reporting tool
Beginner/intermediate knowledge of Python/PySpark and Hadoop/Hive
High attention to detail and analytical skills
Logical approach to problem solving and good written and verbal communication skills
- Job Family Group: Decision Management
- Job Family: Data/Information Management
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
- Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
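The campaign-audit work described above (auditing counts and flagging deviations) can be sketched as a simple tolerance check. Campaign names and figures below are invented; the role itself performs this kind of check in SAS.

```python
def audit_campaign_counts(expected, actual, tolerance_pct=5.0):
    """Compare actual campaign send counts against expected counts and flag
    deviations beyond a percentage tolerance - a basic sanity/audit check."""
    deviations = []
    for campaign, exp in expected.items():
        act = actual.get(campaign, 0)
        pct = abs(act - exp) / exp * 100 if exp else float("inf")
        if pct > tolerance_pct:
            deviations.append((campaign, exp, act, round(pct, 1)))
    return deviations

expected = {"CARD_UPGRADE": 1000, "SAVINGS_PROMO": 500}   # invented figures
actual = {"CARD_UPGRADE": 990, "SAVINGS_PROMO": 430}
flagged = audit_campaign_counts(expected, actual)
# CARD_UPGRADE deviates 1.0% (within tolerance); SAVINGS_PROMO deviates 14.0% (flagged)
```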

Posted 1 week ago

Apply

4.0 - 7.0 years

5 - 8 Lacs

Noida

On-site

Expertise in AWS services like EC2, CloudFormation, S3, IAM, ECS/EKS, EMR, QuickSight, SageMaker, Athena, Glue, etc.
Expertise in Hadoop platform administration and good debugging skills to resolve Hive- and Spark-related issues
Experience in designing, developing, configuring, testing and deploying cloud automation, preferably in AWS
Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc.
Experience in Python and Spark
Working knowledge of CI/CD tools and containers
Key Responsibilities
Interpret and analyze business requirements and convert them into high- and low-level designs.
Design, develop, configure, test and deploy cloud automation for the Finance business unit using tools such as CloudFormation, Terraform and Ansible, while following the capability domain’s engineering standards in an Agile environment.
Take end-to-end ownership of developing, configuring, unit testing and deploying code with quality and minimal supervision.
Work closely with customers, business analysts and the technology and project team to understand business requirements, and drive the analysis and design of quality technical solutions that are aligned with business and technology strategies and comply with the organization's architectural standards.
Understand and follow through on change management procedures to implement project deliverables.
Coordinate with support groups such as Enterprise Cloud Engineering, DevSecOps, and Monitoring to get issues resolved with a quick turnaround time.
Work with the data science user community to address issues in the ML (machine learning) development life cycle.
Required Qualifications
Bachelor’s or Master’s degree in Computer Science or a similar field
4 to 7 years of experience in automation on a major cloud (AWS, Azure or GCP)
Experience in infrastructure provisioning using Ansible, AWS CloudFormation or Terraform, and Python or PowerShell
Working knowledge of AWS services such as EC2, CloudFormation, IAM, S3, EMR, ECS/EKS, etc.
Working knowledge of CI/CD tools and containers
Experience in Hadoop administration and resolving Hive/Spark-related issues
Proven understanding of common development tools, patterns and practices for the cloud
Experience writing automated unit tests in a major programming language
Proven ability to write quality code by following best practices and guidelines
Strong problem-solving, multi-tasking and organizational skills
Good written and verbal communication skills
Demonstrable experience working on a geographically dispersed team
Preferred Qualifications
Experience managing a Hadoop platform and debugging Hive/Spark-related issues
Cloud certification (AWS, Azure or GCP)
Knowledge of UNIX/Linux shell scripting
About Our Company
Ameriprise India LLP has been providing client based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S. based financial planning company headquartered in Minneapolis with a global presence. The firm’s focus areas include Asset Management and Advice, Retirement Planning and Insurance Protection. Be part of an inclusive, collaborative culture that rewards you for your contributions and work with other talented individuals who share your passion for doing great work. You’ll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven and want to work for a strong ethical company that cares, take the next step and create a career at Ameriprise India LLP. Ameriprise India LLP is an equal opportunity employer.
We consider all qualified applicants without regard to race, color, religion, sex, genetic information, age, sexual orientation, gender identity, disability, veteran status, marital status, family status or any other basis prohibited by law. Full-Time/Part-Time Full time Timings (2:00p-10:30p) India Business Unit AWMPO AWMP&S President's Office Job Family Group Technology
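Infrastructure provisioning with CloudFormation, as mentioned above, revolves around declarative templates. A minimal sketch of one, built as a Python dict and rendered to JSON; the resource name and description are invented for illustration, and a real template would be deployed via the AWS CLI or a CI/CD pipeline.

```python
import json

# Minimal CloudFormation template (JSON form) for a versioned S3 bucket.
# "PipelineBucket" and the description are invented placeholders.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Example bucket for a finance data pipeline (illustrative)",
    "Resources": {
        "PipelineBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {
                "VersioningConfiguration": {"Status": "Enabled"},
            },
        }
    },
}

rendered = json.dumps(template, indent=2)
```

The same resource could equally be expressed in Terraform HCL; the declarative shape is the common idea.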

Posted 1 week ago

Apply

4.0 - 7.0 years

8 - 9 Lacs

Noida

On-site

Roles and Responsibilities
Assistant Managers must understand client objectives and collaborate with the Project Lead to design effective analytical frameworks. They should translate requirements into clear deliverables with defined priorities and constraints. Responsibilities include managing data preparation, performing quality checks, and ensuring analysis readiness. They should implement analytical techniques and machine learning methods such as regression, decision trees, segmentation, forecasting, and algorithms like Random Forest, SVM, and ANN.
They are expected to perform sanity checks and quality control of their own work as well as that of junior analysts to ensure accuracy. The ability to interpret results in a business context and identify actionable insights is critical. Assistant Managers should handle client communications independently and interact with onsite leads, discussing deliverables and addressing queries over calls or video conferences.
They are responsible for managing the entire project lifecycle from initiation to delivery, ensuring timelines and budgets are met. This includes translating business requirements into technical specifications, managing data teams, ensuring data integrity, and facilitating clear communication between business and technical stakeholders. They should lead process improvements in analytics and act as project leads for cross-functional coordination.
Client Management
They serve as client leads, maintaining strong relationships and making key decisions. They participate in deliverable discussions and guide project teams on next steps and execution strategy.
Technical Requirements
Assistant Managers must know how to connect databases with KNIME (e.g., Snowflake, SQL) and understand SQL concepts such as joins and unions. They should be able to read/write data to and from databases and use macros and schedulers to automate workflows. They must design and manage KNIME ETL workflows to support BI tools and ensure end-to-end data validation and documentation.
Proficiency in Power BI is required for building dashboards and supporting data-driven decision-making. They must be capable of leading analytics projects using Power BI, Python, and SQL to generate insights. Visualizing key findings using PowerPoint or BI tools like Tableau or QlikView is essential.
Ideal Candidate
Candidates should have 4-7 years of experience in advanced analytics across Marketing, CRM, or Pricing in Retail or CPG; experience in other B2C domains is acceptable. They must be skilled in handling large datasets using Python, R, or SAS and have worked with multiple analytics or machine learning techniques. Comfort with client interactions and working independently is expected, along with a good understanding of consumer sectors such as Retail, CPG, or Telecom.
They should have experience with various data formats and platforms, including flat files, RDBMS, KNIME workflows and server, SQL Server, Teradata, Hadoop, and Spark, on-prem or in the cloud. Basic knowledge of statistical and machine learning techniques like regression, clustering, decision trees, forecasting (e.g., ARIMA), and other ML models is required.
Other Skills
Strong written and verbal communication is essential. They should be capable of creating client-ready deliverables using Excel and PowerPoint. Knowledge of optimization methods, supply chain concepts, VBA, Excel macros, Tableau, and QlikView will be an added advantage.
Qualifications
Engineers from top-tier institutes (IITs, DCE/NSIT, NITs), Postgraduates in Maths/Statistics/OR from top-tier colleges/universities, or an MBA from top-tier B-schools
Job Location
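The SQL join and union concepts mentioned above can be illustrated with SQLite from the Python standard library. Table names and rows are invented for the example.

```python
import sqlite3

# In-memory database with two invented tables to demonstrate an inner join
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ben');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 80.0);
""")

# Inner join: total order amount per customer
rows = conn.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
# rows == [('Asha', 350.0), ('Ben', 80.0)]
```

A UNION would instead stack result sets with matching columns, deduplicating rows unless UNION ALL is used.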

Posted 1 week ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position - Technical Architect
Location - Pune
Experience - 6+ Years
ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.
WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic fun culture of inclusion, collaboration, and high performance – HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level.
JOB TITLE - Technical Architect
B.E/B.Tech, MCA, or M.E/M.Tech graduate with 6-10 years of experience (including 4 years of experience as an application architect or data architect)
• Java/Python/UI/DE
• GCP/AWS/Azure
• Generative AI-enabled application design pattern knowledge is a value addition.
• Excellent technical background with a breadth of knowledge across analytics, cloud architecture, distributed applications, integration, API design, etc.
• Experience in technology stack selection and the definition of solution, technology, and integration architectures for small to mid-sized applications and cloud-hosted platforms.
• Strong understanding of various design and architecture patterns.
• Strong experience in developing scalable architecture.
• Experience implementing and governing software engineering processes, practices, tools, and standards for development teams.
• Proficient in effort estimation techniques; will actively support project managers and scrum masters in planning the implementation and will work with test leads on the definition of an appropriate test strategy for the realization of a quality solution.
• Extensive experience as a technology/engineering subject matter expert, i.e., high-level solution definition, sizing, and RFI/RFP responses.
• Aware of the latest technology trends, engineering processes, practices, and metrics.
• Architecture experience with PaaS and SaaS platforms hosted on Azure, AWS, or GCP.
• Infrastructure sizing and design experience for on-premises and cloud-hosted platforms.
• Ability to understand the business domain and requirements and map them to technical solutions.
• Outstanding interpersonal skills. Ability to connect and present to CXOs from client organizations.
• Strong leadership, business communication, consulting, and presentation skills.
• Positive, service-oriented personality
OVERVIEW OF THE ROLE:
This role serves as a paradigm for the application of team software development processes and deployment procedures. Additionally, the incumbent actively contributes to the establishment of best practices and methodologies within the team.
• Craft and deploy resilient APIs, bridging cloud infrastructure and software development with seamless API design, development, and deployment.
• Work at the intersection of infrastructure and software engineering by designing and deploying data and pipeline management frameworks built on top of open-source components, including Hadoop, Hive, Spark, HBase, Kafka streaming, Tableau, Airflow, and other cloud-based data engineering services like S3, Redshift, Athena, Kinesis, etc.
• Collaborate with various teams to build and maintain the most innovative, reliable, secure, and cost-effective distributed solutions.
• Design and develop big data and real-time analytics and streaming solutions using industry-standard technologies.
• Deliver the most complex and valuable components of an application on time, as per specifications.
• Play the role of a Team Lead; manage or influence a large portion of an account or a small project in its entirety, demonstrating an understanding of and consistently incorporating practical value with theoretical knowledge to make balanced technical decisions.
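The pipeline frameworks named above (e.g., Airflow) schedule tasks as a DAG executed in dependency order. A minimal sketch of that ordering using Python's standard-library graphlib, with invented task names; a real pipeline would declare these as Airflow operators.

```python
from graphlib import TopologicalSorter

# Map each task to the set of tasks it depends on (names are invented)
deps = {
    "ingest_kafka": set(),
    "land_to_s3": {"ingest_kafka"},
    "spark_transform": {"land_to_s3"},
    "load_hive": {"spark_transform"},
    "refresh_dashboard": {"load_hive"},
}

# static_order() yields tasks so that every dependency runs first
order = list(TopologicalSorter(deps).static_order())
```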

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are the ACES Strategic team (Advanced Cloud Engineering & Supportability), a global engineering team in Azure CXP and we are focused on Strategic Azure Customers. We are customer-obsessed problem-solvers. We orchestrate and drive deep engagements in areas like Incident Management, Problem Management, Support, Resiliency, and empowering the customers. We represent the customer and amplify customer voice with Azure Engineering connecting to the quality vision for Azure. We innovate and find ways to scale our learning across our customer base. Diversity and inclusion are central to who we are, how we work, and what we enable our customers to achieve. We know that empowering our customers starts with empowering our team to show up authentically, work in ways that are best for them, and achieve their career goals. Every minute of every day, customers stake their entire business and reputation on the Microsoft Cloud. The Azure Customer Experience (CXP) team believes that when we meet our high standards for quality and reliability, our customers win. If we falter, our customers fail their end-customers. Our vision is to turn Microsoft Cloud customers into fans. Are you constantly customer-obsessed and passionate about solving complex technical problems? Do you take pride in enhancing customer experience through innovation? If the answer is Yes, then join us and surround yourself with people who are passionate about cloud computing and believe that extraordinary support is critical to customer success. As a customer focused Advanced Cloud Engineer, you are the primary engineering contact accountable for your customer’s support experience on Azure. You will drive resolution of critical and complex problems, support key customer projects on Azure and be the voice of the customer within Azure. 
In this role, you will work in partnership with Customer Success Account Managers, Cloud Solution Architects, Technical Support Engineers, and Azure engineering with our mission to turn Azure customers into fans with world-class engineering-led support experience. This role is flexible in that you can work up to 50% from home. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Responsibilities Technically Oriented With minimal oversight, track customer incidents, engage with strategic customers and partners to understand issues, contribute to troubleshooting through diagnostics, communicate progress and next steps to customers with a focus on reducing time taken to mitigate critical incidents. Use engineering and support tools, customer telemetry and/or direct customer input to detect and flag issues in the products or with the customer usage of the products. Help customers stay current with best practices by sharing content. Identify and leverage developmental opportunities across product areas and business processes (e.g., mentorships, shadowing, trainings) for professional growth and to develop technical skills to resolve customer issues. Customer Solution Lifecycle Management With minimal guidance, serve as a connecting point between the product team and customers throughout the engagement life cycle, engage with customers to understand their business and availability needs, develop and offer proactive guidance on designing configurations and deploying solutions on Azure with support from subject matter experts. 
Handle critical escalations on customer issues from the customer, support, or field teams; conduct impact analysis; help customers with answers to their technical questions; and serve as an escalation resource in areas of subject matter expertise. Conduct in-depth root cause analysis of issues, translate findings into opportunities for improvement, and track and drive them as repair items.
Relationship/Experience Management
Act as the voice of customers and channel product feedback from strategic customers to product groups. Identify customer usage patterns and drive resolutions on recurring issues with product groups. Close the feedback loop with the customers on product features. With minimal guidance, partner with other teams (e.g., program managers, software engineers, product, customer service support teams) to prioritize, unblock, and resolve critical customer issues. Collaborate with stakeholders to support delivery of solutions to strategic customers and resolve customer issues. Embody our culture and values.
Qualifications
Required Qualifications:
Bachelor’s degree in Engineering, Computer Science, or a related field AND 6+ years of software industry experience related to technology, OR equivalent experience.
4 years of demonstrated IT experience supporting and troubleshooting enterprise-level, mission-critical applications, resolving complex issues/situations and driving technical resolution across cross-functional organizations.
2+ years of experience in an external customer/client facing role.
2+ years of experience working on cloud computing technologies.
Experience with being on-call.
Technical Skills:
Cloud computing technologies, with demonstrated hands-on experience in one or more of the following:
Core IaaS: Compute, Storage, Networking, High Availability.
Data Platform and Big Data: SQL Server, Azure SQL DB, HDInsight/Hadoop, Machine Learning, Azure Stream Analytics, Azure Data Factory/Databricks.
Azure PaaS Services: Redis Cache, Service Bus, Event Hub, Cloud Service, IoT suite, Mobile Apps, etc.
Experience in monitoring-related technologies like Azure Monitor, Log Analytics, Resource Graph, Azure Alerts, Network Watcher, Grafana, Ambari, Prometheus, Datadog, Confluent, etc.
Experience in deploying, configuring, and operating enterprise monitoring solutions.
Experience in one or more automation languages (PowerShell, Python, C#, open source).
Communication skills: ability to empathize with customers and convey confidence; able to explain highly technical issues to varied audiences; able to prioritize and advocate customers' needs to the proper channels; take ownership and work towards a resolution.
Customer Obsession: passion for customers and focus on delivering the right customer experience.
Growth Mindset: openness and ability to learn new skills and technologies in a fast-paced environment.
The ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.
Microsoft is an equal opportunity employer.
Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
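The monitoring experience listed above (e.g., Azure Alerts) typically involves threshold rules that fire only on sustained breaches rather than single spikes. A minimal plain-Python sketch of such a rule; the threshold and sample values are invented.

```python
def should_alert(samples, threshold, consecutive=3):
    """Fire an alert only after `consecutive` samples in a row exceed the
    threshold - a common anti-flapping pattern in metric alert rules."""
    streak = 0
    for value in samples:
        streak = streak + 1 if value > threshold else 0
        if streak >= consecutive:
            return True
    return False

cpu = [45, 96, 40, 97, 98, 99, 50]   # percent utilization samples (invented)
fires = should_alert(cpu, threshold=90)
# The run 97, 98, 99 is three consecutive breaches, so the rule fires
```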

Posted 1 week ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Skill Examples:
Explain and communicate the design/development to the customer
Perform and evaluate test results against product specifications
Break down complex problems into logical components
Develop user interfaces and business software components
Use data models
Estimate the time and effort required for developing/debugging features/components
Perform and evaluate tests in the customer or target environment
Make quick decisions on technical/project-related challenges
Manage a team, mentor, and handle people-related issues in the team
Maintain high motivation levels and positive dynamics in the team
Interface with other teams, designers, and other parallel practices
Set goals for self and team; provide feedback to team members
Create and articulate impactful technical presentations
Follow a high level of business etiquette in emails and other business communication
Drive conference calls with customers, addressing customer questions
Proactively ask for and offer help
Ability to work under pressure, determine dependencies and risks, and facilitate planning while handling multiple tasks
Build confidence with customers by meeting deliverables on time and with quality
Estimate the time, effort and resources required for developing/debugging features/components
Make appropriate utilization of software/hardware
Strong analytical and problem-solving abilities
Knowledge Examples:
Appropriate software programs/modules
Functional and technical design
Programming languages – proficient in multiple skill clusters
DBMS
Operating systems and software platforms
Software Development Life Cycle
Agile – Scrum or Kanban methods
Integrated development environments (IDE)
Rapid application development (RAD)
Modelling technology and languages
Interface definition languages (IDL)
Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved
Additional Comments
Skills: AWS, Hadoop, Java

Posted 1 week ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance. The opportunity We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years). Understand current and future-state enterprise architecture. Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing, and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies. Skills And Attributes For Success Experience architecting highly scalable solutions on Azure, AWS, and GCP. Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components such as cloud ETL tools, Spark, and Databricks. Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations.
Experience in performance benchmarking of enterprise applications. Experience in data security (in motion and at rest). Strong UNIX operating system concepts and shell scripting knowledge. To qualify for the role, you must have: a flexible, proactive, self-motivated working style with strong personal ownership of problem resolution; excellent communication skills (written and verbal, formal and informal); the ability to multi-task under pressure and work independently with minimal supervision; a team-player mindset, enjoying a cooperative and collaborative team environment; adaptability to new technologies and standards; participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support; responsibility for evaluating technical risks and mapping out mitigation strategies; working knowledge of at least one cloud platform (AWS, Azure, or GCP); excellent business communication, consulting, and quality-process skills; excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains; a minimum of 7 years of hands-on experience in one or more of the above areas; and a minimum of 10 years of industry experience. Ideally, you’ll also have strong project management skills, client management skills, and solutioning skills. What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development.
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
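The responsibilities above centre on real-time ingestion with Kafka and Spark Streaming, whose core pattern is a windowed aggregation over event streams. Purely as a framework-free sketch of the tumbling-window pattern (the event schema and window size here are invented for illustration, not taken from the posting):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed-size tumbling windows
    and count occurrences per key per window -- the aggregation shape a
    streaming job expresses with window() + groupBy()."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Simulated click events: (epoch_seconds, country)
events = [(0, "IN"), (10, "US"), (59, "IN"), (61, "IN"), (125, "US")]
print(tumbling_window_counts(events))
# {0: {'IN': 2, 'US': 1}, 60: {'IN': 1}, 120: {'US': 1}}
```

In an actual Spark Structured Streaming job the same grouping would be expressed with `window()` and `groupBy()` over a Kafka source, with watermarking to handle late events.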

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

Remote

Who We Are At Twilio, we’re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work, and strong culture of connection and global inclusion means that no matter your location, you’re part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we’re acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands. See yourself at Twilio Join the team as our next Senior Machine Learning Engineer (L3) in our Comms Platform Engineering team About The Job This position is needed to scope, design, and deploy machine learning systems into the real world, the individual will closely partner with Product & Engineering teams to execute the roadmap for Twilio’s AI/ML products and services. Twilio is looking for a Senior Machine Learning engineer to join the rapidly growing Comms Platform Engineering team of our Messaging business unit. You will understand the needs of our customers and build data products that solve their needs at a global scale. Working side by side with other engineering teams and product counterparts, you will own end-to-end execution of ML solutions. To thrive in this role, you must have a background in ML engineering, and a track record of solving data & machine-learning problems at scale. 
You are a self-starter, embody a growth mindset, and collaborate effectively across the entire Twilio organization Responsibilities In this role, you’ll: Build and maintain scalable machine learning solutions in production Train and validate both deep learning-based and statistical models, considering use case, complexity, performance, and robustness Demonstrate end-to-end understanding of applications and develop a deep understanding of the “why” behind our models and systems Partner with product managers, tech leads, and stakeholders to analyze business problems, clarify requirements, and define the scope of the systems needed Work closely with data platform teams to build robust, scalable batch and real-time data pipelines Work closely with software engineers and build tools to enhance productivity and to ship and maintain ML models Drive engineering best practices around code reviews, automated testing, and monitoring Qualifications Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table! Required 5+ years of applied ML experience. Proficiency in Python is preferred; we will also consider strong quantitative candidates with a background in other programming languages Strong background in the foundations of machine learning and the building blocks of modern deep learning Track record of building, shipping, and maintaining machine learning models in production in an ambiguous and fast-paced environment.
You have a clear understanding of frameworks like - PyTorch, TensorFlow, or Keras, why and how these frameworks do what they do Familiarity with ML Ops concepts related to testing and maintaining models in production such as testing, retraining, and monitoring. Demonstrated ability to ramp up, understand, and operate effectively in new application / business domains. You’ve explored some of the modern data storage, messaging, and processing tools (Kafka, Apache Spark, Hadoop, Presto, DynamoDB etc.) Experience working in an agile team environment with changing priorities Experience of working on AWS Desired Experience with Large Language Models Location This role will be remote, and based in India (only in Karnataka, TamilNadu, Maharashtra, Telangana and New Delhi). Travel We prioritize connection and opportunities to build relationships with our customers and each other. For this role, you may be required to travel occasionally to participate in project or team in-person meetings. What We Offer Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location. Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions. Twilio is proud to be an equal opportunity employer. 
We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.
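Roles like this one ask for training and validating models end to end. As a minimal, dependency-free illustration of that train-then-predict loop (toy one-feature data with invented values; a production model would of course use PyTorch or TensorFlow, as the posting notes):

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=200):
    """Fit P(y=1|x) = sigmoid(w*x + b) by batch gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x   # gradient of log-loss w.r.t. w
            gb += (p - y)       # gradient of log-loss w.r.t. b
        w -= lr * gw / len(xs)
        b -= lr * gb / len(xs)
    return w, b

# Toy, centered feature (e.g. a normalised engagement score) with labels
# separable at 0 -- illustrative data only.
xs = [-2.5, -1.5, -0.5, 0.5, 1.5, 2.5]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
preds = [1 if 1 / (1 + math.exp(-(w * x + b))) > 0.5 else 0 for x in xs]
print(preds)  # [0, 0, 0, 1, 1, 1]
```

The same structure (fit on training data, check predictions on held-out data, monitor after deployment) carries over to the deep-learning frameworks and MLOps tooling the posting lists.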

Posted 1 week ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance. The opportunity We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years). Understand current and future-state enterprise architecture. Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing, and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies. Skills And Attributes For Success Experience architecting highly scalable solutions on Azure, AWS, and GCP. Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components such as cloud ETL tools, Spark, and Databricks. Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations.
Experience in performance benchmarking of enterprise applications. Experience in data security (in motion and at rest). Strong UNIX operating system concepts and shell scripting knowledge. To qualify for the role, you must have: a flexible, proactive, self-motivated working style with strong personal ownership of problem resolution; excellent communication skills (written and verbal, formal and informal); the ability to multi-task under pressure and work independently with minimal supervision; a team-player mindset, enjoying a cooperative and collaborative team environment; adaptability to new technologies and standards; participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support; responsibility for evaluating technical risks and mapping out mitigation strategies; working knowledge of at least one cloud platform (AWS, Azure, or GCP); excellent business communication, consulting, and quality-process skills; excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains; a minimum of 7 years of hands-on experience in one or more of the above areas; and a minimum of 10 years of industry experience. Ideally, you’ll also have strong project management skills, client management skills, and solutioning skills. What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development.
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
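The batch-ingestion duties above (Hive, Spark) largely come down to large GROUP BY aggregations over ingested tables. A sketch of that query shape, using Python's built-in sqlite3 as a stand-in engine (the table and column names are hypothetical, not from the posting):

```python
import sqlite3

# In-memory database standing in for a Hive warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clickstream (event_date TEXT, country TEXT, clicks INTEGER)")
conn.executemany(
    "INSERT INTO clickstream VALUES (?, ?, ?)",
    [("2024-01-01", "IN", 10), ("2024-01-01", "US", 7),
     ("2024-01-02", "IN", 4),  ("2024-01-02", "IN", 6)],
)

# The same GROUP BY shape a HiveQL ingestion-summary query would use.
rows = conn.execute(
    "SELECT event_date, country, SUM(clicks) "
    "FROM clickstream GROUP BY event_date, country "
    "ORDER BY event_date, country"
).fetchall()
print(rows)
# [('2024-01-01', 'IN', 10), ('2024-01-01', 'US', 7), ('2024-01-02', 'IN', 10)]
```

At Hive/Spark scale the engine distributes the scan and the aggregation across the cluster, but the SQL itself is the same.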

Posted 1 week ago

Apply

10.0 - 15.0 years

30 - 35 Lacs

Hyderabad

Work from Office

What is the Director - Research Scientist AI & Optimization responsible for? The core mandate of this role is to bring innovative digital investment products and solutions to market, leveraging a patented and innovative digital WealthTech/FinTech product - Goals Optimization Engine (GOE) - built with several years of academic research in mathematical optimization, probability theory and AI techniques at its core. The mandate also extends to leveraging cutting edge AI, such as Generative AI, in addition to Reactive AI to create value within various business functions within Franklin Templeton such as Investment Solutions, Portfolio Management, Sales & Distribution, Marketing, and HR functions, among others, in a responsible and appropriate manner. The possibilities are limitless here and present a fantastic opportunity for a self-motivated and driven professional to make significant contributions to the organization and to themselves. What are the ongoing responsibilities of a Director - Research Scientist AI & Optimization? As a Principal Research Scientist - AI and Optimization, you will play a pivotal role in driving innovation, product research, and proof of concepts for our AI research and Goals Optimization Engine (GOE) product roadmap. You will be responsible for mentoring and guiding a team of highly motivated research scientists, creating intellectual property, and ensuring successful client deployments and product development. 
Key Responsibilities: Innovation, Product Research, Proof of Concepts, Pseudocode & Design (40%): Lead and contribute to the multi-year Goals Optimization Engine (GOE) product roadmap, conceptualizing fitment against various industry use cases, creating product variants, and designing new features and enhancements across multiple distribution lines and geographies Mentor and guide a team of research scientists to achieve common objectives Serve as the Subject Matter Expert (SME) for a specific domain within AI and/or Optimization, acting as the go-to person for all internal stakeholders Develop pseudocode and working prototypes in a Python environment, collaborating closely with Product Managers and Product Developers Create well-articulated design documents and presentations to explain research to internal and external stakeholders, including clients and partners located globally Lead industry research and evaluate partnerships with third-party vendors and specialized service providers where appropriate Maintain a thorough understanding of boundary conditions, regulatory environments, data challenges, technology integrations, algorithmic dependencies, and operational process nuances to ensure nothing slips through the cracks Stay up to date with the latest developments in the Investment Management industry, Financial Mathematics, Portfolio Construction, and Portfolio Management IP Creation, Paper Writing, and Thought Leadership (30%): Conceptualize and produce high-quality intellectual property for publication in top-tier academic and practitioner journals Peer-review the work of other research scientists and improve the outcome of their research output Create patent-worthy intellectual content, apply for patents, and win them Take responsibility for winning industry awards for exceptional product research and innovative work products Stay informed about the latest industry research and evaluate it objectively and in an unbiased manner Publish research works for 
conferences Client Deployment, Product Development, and Vendor Due Diligence (30%): Act as the SME in initial client deployment discussions, showcasing the rigor of research, explaining the product or solution concept, and engaging in discussions with similar individuals/teams from the client/partner side Contribute to product development by ensuring alignment with research and design Provide hands-on support where required to complete time-critical work successfully Engage with third-party vendors and potential integration partners to understand their capabilities, methodologies, and algorithms, and perform rigorous due diligence to make clear recommendations on Go/No-Go decisions What ideal qualifications, skills & experience would help someone to be successful? Education: Bachelor's and master's degrees in STEM disciplines; a PhD in a relevant discipline (Optimization, Probability, Quant Finance, AI & ML, Computational Mathematics, Statistics, etc.) would be a plus Relevant industry certifications Experience - Core Skills: 10+ years of applied R&D experience in research departments of reputed organizations post-Masters or PhD Track record of real innovation generating impact is essential Demonstrated ability to create intellectual content, publish papers, and obtain patents Ability to effectively bridge the gap between academia and practice, ensuring research is practical and implementable Structured thinking and exceptional mathematical skills Excellent team player with the ability to work with ambiguity and thrive in chaos Familiarity with ML/DL/DRL, NLP (especially Large Language Models), dynamic programming, and/or convex optimization Solid experience with AWS and/or Azure Familiarity with Python, PySpark, SQL, Hadoop, C++ Experience - Other Soft Skills: Proven ability to take initiative and work under pressure in a changing, fast-paced environment Exceptional decision-making skills, with the ability to prioritize across needs given limited resources Thrives in a
startup-like environment: loves dealing with a fast pace and changing needs Ability to build relationships both inside and outside of the product organization Ability to narrate a story for a problem along with the capacity to dive into minute details Superlative communication and consensus-building skills Work Shift Timings - 2:00 PM - 11:00 PM IST
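The role lists dynamic programming among the core techniques behind the Goals Optimization Engine. GOE's actual methodology is patented and proprietary, so purely as a generic illustration of DP-style allocation, here is a 0/1 knapsack that chooses which goals to fund under a budget (costs and utility scores are invented for illustration):

```python
def best_goal_subset(goals, budget):
    """0/1 knapsack over (cost, utility) goal tuples: maximise total
    utility fundable within the budget. dp[c] = best utility using <= c."""
    dp = [0] * (budget + 1)
    for cost, utility in goals:
        # Iterate downward so each goal is funded at most once.
        for c in range(budget, cost - 1, -1):
            dp[c] = max(dp[c], dp[c - cost] + utility)
    return dp[budget]

# (cost in budget units, utility score) -- illustrative numbers only.
goals = [(4, 10), (3, 7), (5, 12), (2, 4)]
print(best_goal_subset(goals, budget=9))  # funds the 4- and 5-cost goals -> 22
```

Real goals-based optimization also folds in probabilities of success and time horizons, which is where the probability-theory side of the role comes in.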

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Bengaluru

Hybrid

Job Title: SDE 3 Senior Data Engineer Location: Bengaluru (Hybrid 3 days/week in office) Experience: 8-11 Years Type: Full-time Apply: Share your resume with the details listed below to vijay.s@xebia.com Availability: Immediate joiners or max 2 weeks' notice period only About the Role Xebia is looking for an experienced and hands-on SDE 3 Senior Data Engineer to lead the development of scalable data solutions. As a senior IC (individual contributor), you’ll influence architecture decisions, coach teams, and deliver high-performance data engineering systems for large-scale, enterprise environments. You’ll work across the full data lifecycle—from ingestion and storage to transformation and analytics—leveraging technologies like Spark, Scala, SQL, Cloud Native tools, and Hadoop , in a fast-paced, agile environment. Key Responsibilities Lead the design and implementation of data pipelines using Apache Spark and Scala Architect cloud-native, scalable, and fault-tolerant data platforms (Azure preferred) Drive development of streaming pipelines using Kafka/Event Hub/Spark Streaming Guide system design with a focus on scalability, low-latency, and performance Work on structured and unstructured data, Data Lakes, and Medallion Architecture Collaborate with stakeholders, mentor junior engineers, and lead Agile squads Implement best practices for CI/CD, containerization (Docker/Kubernetes), and orchestration (Airflow/Oozie) Must-Have Skills Apache Spark, Scala (or Java with strong preference for Scala) SQL and Data Structures, Query Optimization Hadoop and Distributed Systems Cloud-native architecture (Azure preferred) System Design, Big Data Design Patterns CI/CD: Git, Jenkins, Docker, Kubernetes Kafka/Structured Streaming Experience with NoSQL, Messaging Queues, Orchestration tools Good-to-Have Skills Apache Iceberg, Parquet, Ceph, Kafka Connect Experience with Data Governance tools: Alation, Collibra Data Lakes and Medallion Architecture Metadata Management, Master Data 
Management Data Quality and Lineage frameworks Why Xebia? At Xebia, you’ll work with passionate technologists solving large-scale problems using modern data stacks. We foster innovation, cross-functional learning, and continuous growth. Be part of a dynamic team that delivers real impact in data-driven enterprises. To Apply Please share your updated resume and include the following details in your email to vijay.s@xebia.com : Full Name: Total Experience: Current CTC: Expected CTC: Current Location: Preferred Xebia Location: Bengaluru Notice Period / Last Working Day (if serving): Primary Skills: LinkedIn Profile URL: Note: Only candidates who can join immediately or within 2 weeks will be considered. Join Xebia and shape the future of data engineering in enterprise systems.
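The posting references Medallion Architecture (bronze/silver/gold layers). A plain-Python sketch of the idea, with hypothetical field names; in a real pipeline these would be Delta or Iceberg tables processed by Spark:

```python
# Bronze: raw ingested records, duplicates and bad rows included.
bronze = [
    {"order_id": 1, "amount": "100", "region": "south"},
    {"order_id": 1, "amount": "100", "region": "south"},   # duplicate
    {"order_id": 2, "amount": "oops", "region": "north"},  # malformed
    {"order_id": 3, "amount": "250", "region": "north"},
]

# Silver: deduplicate on the business key and enforce types.
seen, silver = set(), []
for r in bronze:
    try:
        amount = float(r["amount"])
    except ValueError:
        continue  # a real pipeline would quarantine malformed rows
    if r["order_id"] not in seen:
        seen.add(r["order_id"])
        silver.append({"order_id": r["order_id"], "amount": amount, "region": r["region"]})

# Gold: business-level aggregate (revenue per region).
gold = {}
for r in silver:
    gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
print(gold)  # {'south': 100.0, 'north': 250.0}
```

The point of the layering is that each table has a clear contract: bronze is append-only raw history, silver is cleaned and conformed, gold is consumption-ready.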

Posted 1 week ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Pune, Chennai, Bengaluru

Work from Office

Hiring for Ab Initio Developer for our MNC client at Pune/Chennai/Bangalore Exp - 4+ yrs Notice - immediate to 15 days joiners JD: Ab Initio Proficiency/good understanding in at least a couple of the following: Ab Initio, SQL, data modelling/relationships, RDBMS, Hadoop, machine learning algorithms Ensure consistency in approach, design, and output to deliver quality and concise design solution documentation. Lead best-practice methodology for design Define and promote re-usable, extendible, scalable, maintainable solutions, considering the trade-off of cost vs. benefit Communicate at all levels in a clear and credible way about the importance of solution design. Lead developers in the team and foster a shared vision for design. Knowledgeable and skilled at the practice of data-driven design solutions. Expert communication at all levels of the Barclays team, from design teams and build teams to the leadership team, including presenting work, negotiating timelines and workloads, and inputting on the business strategy for data-driven projects and products. This will include presentations and reports, often detailed, but easy to understand and convincing. Essential Skills Experience of banking/financial industries a. Solution design for multiple large-scale (1+ mn) projects b. Stakeholder management, including persons working at a higher grade c. Solid understanding of ETL, MI/BI, data warehouse concepts, data structures, and related technologies and solutions Interested candidates can share their CV at shivani.c@artechinfo.in or call me at 9193549698. Thanks Shivani Chaturvedi HR at Artech

Posted 1 week ago

Apply

10.0 - 14.0 years

20 - 30 Lacs

Noida, Pune, Bengaluru

Hybrid

Greetings from Infogain! We have an immediate requirement for a Big Data Engineer (Lead) position at Infogain India Pvt. Ltd. As a Big Data Engineer (Lead), you will be responsible for leading a team of big data engineers. You will work closely with clients and team members to understand their requirements and develop architectures that meet their needs. You will also be responsible for providing technical leadership and guidance to your team. Mode of Hiring: Permanent Skills: (Azure OR AWS) AND (Apache Spark OR Hive OR Hadoop) AND (Spark Streaming OR Apache Flink OR Kafka) AND NoSQL AND (Shell OR Python). Exp: 10 to 14 years Location: Bangalore/Noida/Gurgaon/Pune/Mumbai/Kochi Notice period: early joiners Educational Qualification: BE/BTech/MCA/M.Tech Working Experience 12-15 years of broad experience working with enterprise IT applications in cloud platform and big data environments. Competencies & Personal Traits Work as a team player Excellent problem analysis skills Experience with at least one cloud infra provider (Azure/AWS) Experience in building data pipelines using batch processing with Apache Spark (Spark SQL, Dataframe API) or Hive query language (HQL) Experience in building streaming data pipelines using Apache Spark Structured Streaming or Apache Flink on Kafka & Delta Lake Knowledge of NoSQL databases. Good to have experience in Cosmos DB, RESTful APIs and GraphQL Knowledge of big data ETL processing tools, data modelling and data mapping. Experience with Hive and Hadoop file formats (Avro/Parquet/ORC) Basic knowledge of scripting (shell/bash) Experience of working with multiple data sources including relational databases (SQL Server/Oracle/DB2/Netezza), NoSQL/document databases, flat files Basic understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo and Azure DevOps.
Basic understanding of DevOps practices using Git version control Ability to debug, fine tune and optimize large scale data processing jobs Can share CV @ arti.sharma@infogain.com Total Exp Experience- Relevant Experience in Big data Relevant Exp in AWS OR Azure Cloud- Current CTC- Exp CTC- Current location - Ok for Bangalore location-
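Several of the requirements above come down to batch map/reduce-style processing. The classic word-count example, sketched in plain Python to show the map, shuffle, and reduce phases that a Spark or Hadoop job performs at cluster scale (input lines are invented):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) for every word -- one call per input split."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce: sum counts per key. In a real framework the shuffle
    groups keys across nodes first; here a dict does the grouping."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["Spark and Hive", "spark on Hadoop", "Hive queries"]
print(reduce_phase(map_phase(lines)))
# {'spark': 2, 'and': 1, 'hive': 2, 'on': 1, 'hadoop': 1, 'queries': 1}
```

In Spark's Dataframe API the same job is a `split`/`explode` followed by `groupBy().count()`; the distributed shuffle is what the frameworks add.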

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

Remote

Job Title: Data Engineer (Remote) Working Hours: 4-hour overlap with EST (9 AM–1 PM) Type: Full-Time | Department: Engineering We’re hiring skilled Data Engineers to join our remote tech team. You'll develop scalable, cloud-based data products and lead small teams to deliver high-impact solutions. Ideal candidates bring deep technical expertise and a passion for innovation. Key Responsibilities: Build and optimize scalable data systems and pipelines Design APIs for data integration Lead a small development team, conduct code reviews, mentor juniors Collaborate with cross-functional teams Contribute to architecture and system design Must-Have Skills: 8+ years in Linux, Bash, Python, SQL 4+ years in Spark, Hadoop ecosystem 4+ years with AWS (EMR, Glue, Athena, Redshift) Team leadership experience Preferred: Experience with dbt, Airflow, Hive, data cataloging tools Knowledge of GCP, scalable pipelines, data partitioning/clustering BS/MS/PhD in CS or equivalent experience
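The preferred skills mention data partitioning. A minimal sketch of Hive-style date partitioning, where a deterministic storage path is derived from the partition columns so query engines can prune irrelevant data (the bucket and column names are illustrative, not from the posting):

```python
from datetime import date

def partition_path(base, dt, country):
    """Hive-style partition layout: base/ds=YYYY-MM-DD/country=XX.
    Engines like Athena or Glue prune scans by matching these
    key=value path segments against WHERE-clause predicates."""
    return f"{base}/ds={dt.isoformat()}/country={country}"

print(partition_path("s3://bucket/events", date(2024, 1, 15), "IN"))
# s3://bucket/events/ds=2024-01-15/country=IN
```

Choosing partition columns with moderate cardinality (dates, regions) is what keeps both file counts and scan sizes manageable.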

Posted 1 week ago

Apply