9.0 - 14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings from TCS! TCS is hiring for Big Data Architect.
Location - PAN India
Years of Experience - 9-14 years
Job Description -
- Experience with Python, Spark, and Hive data pipelines using ETL processes
- Apache Hadoop development and implementation
- Experience with streaming frameworks such as Kafka
- Hands-on experience with Azure/AWS/Google data services
- Work with big data technologies (Spark, Hadoop, BigQuery, Databricks) for data preprocessing and feature engineering
Posted 4 days ago
15.0 years
0 Lacs
India
On-site
SAS Solution Designer

We are seeking a highly experienced SAS Solution Designer to join our team in a solution engineering lead capacity. This role requires in-depth knowledge of SAS technologies, cloud-based platforms, and data solutions. The ideal candidate will be responsible for end-to-end solution design aligned with enterprise architecture standards and business objectives, providing technical leadership across squads and development teams. Mitra AI is currently looking for experienced SAS Solution Designers who are based in India and are open to relocating. This is a hybrid opportunity in Sydney, Australia.

JOB SPECIFIC DUTIES & RESPONSIBILITIES
- Own and define the end-to-end solution architecture for data platforms, ensuring alignment with business objectives, enterprise standards, and architectural best practices.
- Design reliable, stable, and scalable SAS-based solutions that support long-term operational effectiveness.
- Lead solution engineers and Agile squads to ensure the delivery of high-quality, maintainable data solutions.
- Collaborate independently with business and technical stakeholders to understand requirements and translate them into comprehensive technical designs.
- Provide high-level estimates for proposed features and technical initiatives to support business planning and prioritization.
- Conduct and participate in solution governance forums to secure approval for data designs and strategies.
- Drive continuous improvement by identifying technical gaps and implementing best practices, emerging technologies, and enhanced processes.
- Facilitate work breakdown sessions and actively participate in Agile ceremonies such as sprint planning and backlog grooming.
- Ensure quality assurance through rigorous code reviews, test case validation, and enforcement of coding and documentation standards.
- Troubleshoot complex issues by performing root cause analysis, log reviews, and coordination with relevant teams for resolution.
- Provide mentoring and coaching to solution engineers and technical leads to support skills growth and consistency in solution delivery.

REQUIRED COMPETENCIES AND SKILLS
- Deep expertise in SAS technologies and ecosystem.
- Strong proficiency in cloud-based technologies and data platforms (e.g., Azure, Hadoop, Teradata).
- Solid understanding of RDBMS, ETL/ELT tools (e.g., Informatica), and real-time data streaming.
- Ability to work across relational and NoSQL databases and integrate with various data and analytics tools.
- Familiarity with BI and reporting tools such as Tableau and Power BI.
- Experience guiding Agile delivery teams, supporting full-stack solution development through DevOps and CI/CD practices.
- Capability to define and implement secure, scalable, and performant data solutions.
- Strong knowledge of metadata management, reference data, and data lineage concepts.
- Ability to communicate effectively with both technical and non-technical stakeholders.
- Problem-solving mindset with attention to detail and an emphasis on delivering high-quality solutions.

REQUIRED EXPERIENCE AND QUALIFICATIONS
- 15+ years of experience in solution design and development roles, including leadership responsibilities.
- Strong exposure to SAS and enterprise data platforms in the financial services industry.
- Prior experience working within risk, compliance, or credit risk domains is highly desirable.
- Practical experience with Agile methodologies and DevOps principles.
- Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or related field.
- Experience working in cross-functional teams with a focus on business alignment and technology delivery.
Posted 4 days ago
5.0 years
0 Lacs
India
Remote
Job Title: Data Engineer
Location: Remote
Experience: 5+ years

Job Summary:
We are seeking a highly skilled Data Engineer with strong experience in ETL development, data replication, and cloud data integration to join our remote team. The ideal candidate will be proficient in Talend, have hands-on experience with IBM Data Replicator and Qlik Replicate, and demonstrate deep knowledge of Snowflake architecture, CDC processes, and data transformation scripting.

Key Responsibilities:
- Design, develop, and maintain robust ETL pipelines using Talend integrated with Snowflake.
- Implement and manage real-time data replication solutions using IBM Data Replicator and Qlik Replicate.
- Work with complex data source systems including DB2 (containerized and traditional), Oracle, and Hadoop.
- Model and manage slowly changing dimensions (Type 2 SCD) in Snowflake.
- Optimize data pipelines for scalability, reliability, and performance.
- Design and implement Change Data Capture (CDC) strategies to support real-time and incremental data flows.
- Write efficient and maintainable code in SQL, Python, or Shell to support data transformations and automation.
- Collaborate with data architects, analysts, and other engineers to support data-driven initiatives.

Required Skills & Qualifications:
- Strong proficiency in Talend ETL development and integration with Snowflake.
- Practical experience with IBM Data Replicator and Qlik Replicate.
- In-depth understanding of Snowflake architecture and Type 2 SCD data modeling.
- Familiarity with containerized environments and various data sources such as DB2, Oracle, and Hadoop.
- Experience implementing CDC and real-time data replication patterns.
- Proficiency in SQL, Python, and Shell scripting.
- Excellent problem-solving and communication skills.
- Self-motivated and comfortable working independently in a fully remote environment.

Preferred Qualifications:
- Snowflake certification or Talend certification.
- Experience working in an Agile or DevOps environment.
- Familiarity with data governance and data quality best practices.
Posted 4 days ago
10.0 - 15.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position - Cloudera Data Engineer
Location - Chennai
Notice Period - 0-30 days / immediate joiners
Experience - 10 to 15 years

The Cloudera Data Engineer will focus on designing, building, and maintaining scalable data pipelines and platforms within the Cloudera Hadoop ecosystem. Key skills include expertise in data warehousing, ETL processes, and strong programming abilities in languages like Python and SQL. They will also need to be proficient in Cloudera tools, including Spark, Hive, and potentially Airflow for orchestration.

Thank you
Posted 4 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
TEKsystems is seeking a Senior AWS + Data Engineer to join our dynamic team. The ideal candidate should have data engineering expertise with Hadoop and Scala/Python alongside AWS services. This role involves designing, developing, and maintaining scalable and reliable software solutions.

Job Title: Data Engineer – Spark/Scala (Batch Processing)
Location: Manyata - Hybrid
Experience: 7+ years
Type: Full-Time

Mandatory Skills:
- 7-10 years' experience in design, architecture, or development in Analytics and Data Warehousing.
- Experience in building end-to-end solutions on a big data platform with Spark or Scala programming.
- 5 years of solid experience in ETL pipeline building with the Spark or Scala programming framework, with knowledge of developing UNIX shell scripts and Oracle SQL/PL-SQL.
- Experience in big data platforms for ETL development on the AWS cloud platform.
- Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, Secrets Manager, Step Functions, SQS, SNS, CloudWatch.
- Excellent skills in Python-based framework development are mandatory.
- Experience with Oracle SQL database programming, SQL performance tuning, and relational model analysis.
- Extensive experience with Teradata data warehouses and Cloudera Hadoop.
- Proficient across Enterprise Analytics/BI/DW/ETL technologies such as Teradata Control Framework, Tableau, OBIEE, SAS, Apache Spark, and Hive.
- Analytics & BI architecture appreciation and broad experience across all technology disciplines.
- Experience working within a Data Delivery Life Cycle framework and Agile methodology.
- Extensive experience in large enterprise environments handling large volumes of datasets with high SLAs.
- Good knowledge of developing UNIX scripts, Oracle SQL/PL-SQL, and Autosys JIL scripts.
- Well versed in AI-powered engineering tools like Cline and GitHub Copilot.

Please send resumes to nvaseemuddin@teksystems.com or kebhat@teksystems.com.
Posted 4 days ago
4.0 years
0 Lacs
Mumbai, Maharashtra
Remote
Solution Engineer - Data & AI
Mumbai, Maharashtra, India
Date posted: Jun 16, 2025
Job number: 1830869
Work site: Up to 50% work from home
Travel: 25-50%
Role type: Individual Contributor
Profession: Technology Sales
Discipline: Technology Specialists
Employment type: Full-Time

Overview
As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications
- 6+ years technical pre-sales, technical consulting, technology delivery, or related experience, OR equivalent experience.
- 4+ years experience with cloud and hybrid, or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management.
- Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks), and Azure Synapse Gen2.
OR
- 5+ years technical pre-sales or technical consulting experience, OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience, OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ years technical pre-sales or technical consulting experience, OR equivalent experience.
- Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), covering migration and modernization as well as creating new AI apps.
- Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance.
- Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.

Responsibilities
- Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments.
- Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft’s cloud platforms.
- Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
- Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
- Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL.
- Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop, and BI solutions.
- Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
- Industry leading healthcare
- Educational resources
- Discounts on products and services
- Savings and investments
- Maternity and paternity leave
- Generous time away
- Giving programs
- Opportunities to network and connect

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 4 days ago
4.0 years
0 Lacs
Gurugram, Haryana
Remote
Solution Engineer - Cloud & Data AI
Gurgaon, Haryana, India
Date posted: Jun 16, 2025
Job number: 1830866
Work site: Up to 50% work from home
Travel: 25-50%
Role type: Individual Contributor
Profession: Technology Sales
Discipline: Technology Specialists
Employment type: Full-Time

Overview
As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications
Preferred:
- 6+ years technical pre-sales, technical consulting, technology delivery, or related experience, OR equivalent experience.
- 4+ years experience with cloud and hybrid, or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management.
- Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks), and Azure Synapse Gen2.
OR
- 5+ years technical pre-sales or technical consulting experience, OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience, OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ years technical pre-sales or technical consulting experience, OR equivalent experience.
- Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), covering migration and modernization as well as creating new AI apps.
- Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and other cloud products (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance.
- Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.

Responsibilities
- Drive technical conversations with decision makers using demos and PoCs to influence solution design and enable production deployments.
- Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft’s cloud platforms.
- Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
- Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
- Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL.
- Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop, and BI solutions.
- Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
- Industry leading healthcare
- Educational resources
- Discounts on products and services
- Savings and investments
- Maternity and paternity leave
- Generous time away
- Giving programs
- Opportunities to network and connect

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 4 days ago
0.0 years
0 Lacs
Hyderabad, Telangana
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or equivalent practical experience.
- Experience in automating infrastructure provisioning, Developer Operations (DevOps), integration, or delivery.
- Experience in networking, compute infrastructure (e.g., servers, databases, firewalls, load balancers) and architecting, developing, or maintaining cloud solutions in virtualized environments.
- Experience in scripting with Terraform and Networking, DevOps, Security, Compute, Storage, Hadoop, Kubernetes, or Site Reliability Engineering.

Preferred qualifications:
- Certification in Cloud with experience in Kubernetes, Google Kubernetes Engine, or similar.
- Experience with customer-facing migration including service discovery, assessment, planning, execution, and operations.
- Experience with IT security practices like identity and access management, data protection, encryption, certificate and key management.
- Experience with Google Cloud Platform (GCP) techniques like prompt engineering, dual encoders, and embedding vectors.
- Experience in building prototypes or applications.
- Experience in one or more of the following disciplines: software development, managing operating system environments (Linux or related), network design and deployment, databases, storage systems.

About the job
The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google’s global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
- Provide domain expertise in cloud platforms and infrastructure to solve cloud platform challenges.
- Work with customers to design and implement cloud-based technical architectures, migration approaches, and application optimizations that enable business objectives.
- Be a technical advisor and perform troubleshooting to resolve technical challenges for customers.
- Create and deliver best practice recommendations, tutorials, blog articles, and sample code.
- Travel up to 30% in-region for meetings, technical reviews, and onsite delivery activities.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 4 days ago
0.0 - 6.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
Join Our Innovation-Driven Team at NexusLink!

Position: AI/ML Engineer – Team Lead
Location: Naroda, Ahmedabad, Gujarat
Experience: 4 to 6 Years
Employment Type: Full-Time

About Us
At NexusLink, we are committed to building smart, scalable, and impactful digital solutions that empower businesses through innovation. As part of our fast-growing technology team, you’ll have the opportunity to lead groundbreaking AI/ML projects and drive the next wave of intelligent automation.

Role Overview
We are looking for a skilled and forward-thinking AI/ML Engineer – Team Lead who can take ownership of our AI initiatives, guide a talented team, and deliver high-impact machine learning solutions.

Key Responsibilities
- Lead and mentor a team of AI/ML engineers and data scientists.
- Design and deploy production-grade machine learning models across domains.
- Work closely with cross-functional teams to translate business needs into ML solutions.
- Review and optimize model performance, data pipelines, and scalability.
- Stay ahead of industry trends, tools, and frameworks to continuously enhance our AI capabilities.
- Drive experimentation, innovation, and implementation of AI best practices.

✅ What We’re Looking For
- Bachelor’s/Master’s degree in Computer Science, AI, Data Science, or a related field.
- 4 to 6 years of experience in machine learning and AI application development.
- Proficiency in Python and ML libraries such as TensorFlow, PyTorch, scikit-learn, etc.
- Strong understanding of supervised/unsupervised learning, NLP, and computer vision.
- Experience with model deployment using cloud platforms (AWS, GCP, or Azure).
- Team leadership experience with excellent communication and mentoring skills.
- Familiarity with MLOps, version control, and DevOps tools is a plus.

Good to Have
- Experience with containerization (Docker, Kubernetes).
- Exposure to big data tools like Spark or Hadoop.
- Participation in AI/ML research or open-source contributions.

Why Work with NexusLink?
- Innovative and open culture with a tech-first mindset.
- Lead real-world AI implementations.
- Flexible work environment with continuous learning and upskilling opportunities.
- Competitive pay and fast-track growth opportunities.

How to Apply
Ready to lead the AI revolution? Send your resume to ansuya.ghosh@nexuslinkservices.in with the subject line: AI/ML Engineer – Team Lead Application.
Posted 4 days ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or equivalent practical experience.
- Experience in automating infrastructure provisioning, Developer Operations (DevOps), integration, or delivery.
- Experience in networking, compute infrastructure (e.g., servers, databases, firewalls, load balancers) and architecting, developing, or maintaining cloud solutions in virtualized environments.
- Experience in scripting with Terraform and Networking, DevOps, Security, Compute, Storage, Hadoop, Kubernetes, or Site Reliability Engineering.

Preferred qualifications:
- Certification in Cloud with experience in Kubernetes, Google Kubernetes Engine, or similar.
- Experience with customer-facing migration including service discovery, assessment, planning, execution, and operations.
- Experience with IT security practices like identity and access management, data protection, encryption, certificate and key management.
- Experience with Google Cloud Platform (GCP) techniques like prompt engineering, dual encoders, and embedding vectors.
- Experience in building prototypes or applications.
- Experience in one or more of the following disciplines: software development, managing operating system environments (Linux or related), network design and deployment, databases, storage systems.

About the job
The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google’s global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
- Provide domain expertise in cloud platforms and infrastructure to solve cloud platform challenges.
- Work with customers to design and implement cloud-based technical architectures, migration approaches, and application optimizations that enable business objectives.
- Be a technical advisor and perform troubleshooting to resolve technical challenges for customers.
- Create and deliver best practice recommendations, tutorials, blog articles, and sample code.
- Travel up to 30% in-region for meetings, technical reviews, and onsite delivery activities.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 4 days ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Are you our “TYPE”?

Monotype (Global)
Named "One of the Most Innovative Companies in Design" by Fast Company, Monotype brings brands to life through type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences.

Monotype Solutions India
Monotype Solutions India is a strategic center of excellence for Monotype and is a certified Great Place to Work® three years in a row. The focus of this fast-growing center spans Product Development, Product Management, Experience Design, User Research, Market Intelligence, Research in areas of Artificial Intelligence and Machine Learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales. Headquartered in the Boston area of the United States and with offices across 4 continents, Monotype is the world’s leading company in fonts and a trusted partner to the world’s top brands.
About The Role
We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks, and find alternate solutions to various problems. In addition, the role also demands leading, motivating, and mentoring other team members with respect to technical challenges.

You Will Have An Opportunity To
- Work in a scrum team to design and build high-quality customer-facing software.
- Provide hands-on technical leadership and mentoring, and ensure a great user experience.
- Write unit, functional, and end-to-end tests using mocha, chai, sinon, karateJS, and codeceptJS.
- Help design our architecture and set code standards for ReactJS & NodeJS development.
- Gain product knowledge by successfully developing features for our applications.
- Communicate effectively with stakeholders, peers, and others.

What We’re Looking For
- 8-10 years of experience developing complex, scalable web-based applications.
- Experienced in test-driven development, continuous integration, and continuous delivery.
- 6+ years of extensive MERN/MEVN (MongoDB, ExpressJS, ReactJS/VueJS, and NodeJS) stack hands-on development experience; NodeJS as the primary skill, with ReactJS/VueJS/ExpressJS exposure.
- Experience in Electron, C++, and/or Objective-C.
- Good problem-solving and analytical skills.
- Hands-on in designing and defining database schemas using RDBMS and NoSQL databases.
- Experience working in an Agile development environment.
- Experience with web services, REST APIs, and microservices.
Experience with Amazon AWS services & real-time data analytics technology (Hadoop, Spark, Kinesis, etc.) Experience with Git, Bitbucket, or GitHub and the feature-branching workflow. Excellent written and oral communication skills, and the ability to work in a global, distributed environment with the agility to tailor communication for different audiences. Knowledge of or experience with GitHub Copilot. Monotype is an Equal Opportunities Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status.
Posted 4 days ago
6.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities: • Designs, implements and maintains reliable and scalable data infrastructure • Writes, deploys and maintains software to build, integrate, manage, maintain, and quality-assure data • Develops and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud • Mentors and shares knowledge with the team to provide design reviews, discussions and prototypes • Works with customers to deploy, manage, and audit standard processes for cloud products • Adheres to and advocates for software & data engineering standard processes (e.g. data engineering pipelines, unit testing, monitoring, alerting, source control, code review & documentation) • Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains and improves CI/CD pipelines • Service reliability and following site-reliability engineering standard processes: on-call rotations for services they maintain, responsible for defining and maintaining SLAs. Designs, builds, deploys and maintains infrastructure as code. Containerizes server deployments.
• Part of a cross-disciplinary team working closely with other data engineers, architects, software engineers, data scientists, data managers and business partners in a Scrum/Agile setup Mandatory skill sets: ‘Must have’ knowledge, skills and experiences: Synapse, ADF, Spark, SQL, PySpark, Spark-SQL Preferred skill sets: ‘Good to have’ knowledge, skills and experiences: Cosmos DB, data modeling, Databricks, Power BI, experience of having built analytics solutions with SAP as the data source for ingestion pipelines. Depth: Candidate should have in-depth hands-on experience with respect to end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse and Databricks, with excellent coding skills in PySpark and SQL and logic-building capabilities. He/she should have sound knowledge of optimizing workloads. Years of experience required: 6 to 9 years of relevant experience Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% or above) Expected Joining: 3 weeks Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Bachelor of Technology, Master of Business Administration Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Structured Query Language (SQL) Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more} Desired Languages (If blank, desired languages not specified) Travel
Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
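Several of these data engineering roles, this PwC posting included, revolve around ingestion and transformation pipelines built with ADF, PySpark and SQL. As a rough, stdlib-only Python sketch of the kind of transform step such a pipeline performs — the field names and raw-row shape are illustrative assumptions, not anyone's production code — raw string fields are cast to typed values and unparseable rows are routed to a reject pile:

```python
# Illustrative transform step: cast raw string fields to typed values,
# collecting rows that fail the cast instead of crashing the pipeline.
from datetime import date

def transform(raw_rows):
    """Cast raw CSV-style rows to typed records; collect rejects."""
    good, bad = [], []
    for row in raw_rows:
        try:
            good.append({
                "order_id": int(row["order_id"]),       # assumed field name
                "amount": float(row["amount"]),          # assumed field name
                "day": date.fromisoformat(row["day"]),   # assumed field name
            })
        except (KeyError, ValueError):
            bad.append(row)  # quarantined for later inspection
    return good, bad

raw = [
    {"order_id": "1", "amount": "19.99", "day": "2024-05-01"},
    {"order_id": "x", "amount": "?", "day": "2024-05-02"},  # rejected
]
good, bad = transform(raw)
print(len(good), len(bad))  # 1 1
```

In a real ADF/PySpark pipeline the same idea typically appears as a typed schema plus bad-record handling, but the cast-and-reject shape is the same.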
Posted 5 days ago
2.0 - 6.0 years
12 - 22 Lacs
Bengaluru
Work from Office
Lifesight is a fast-growing SaaS company focused on helping businesses leverage data & AI to improve customer acquisition and retention. We have a team of 130 serving 300+ customers across 5 offices in the US, Singapore, India, Australia, and the UK. Our mission is to make it easy for non-technical marketers to leverage advanced data activation and marketing measurement tools that are powered by AI, to improve their performance and achieve their KPIs. Our product is being adopted rapidly worldwide, and we need the best people on board to accelerate our growth. With petabytes of data and more than 400 TB of daily data processing powering Lifesight's attribution and measurement platforms, building scalable, highly available, fault-tolerant big data platforms is critical for our success. From your first day at Lifesight, you'll make a valuable - and valued - contribution. We offer you the opportunity to delight customers around the world while gaining meaningful experience across a variety of disciplines. About The Role Lifesight is growing rapidly and seeking a strong Data Engineer to be a key member of the Data and Business Intelligence organization with a focus on deep data engineering projects. You will be joining as one of the few initial data engineers as part of the data platform team in our Bengaluru office. You will have an opportunity to help define our technical strategy and data engineering team culture in India. You will design and build data platforms and services while managing our data infrastructure in cloud environments that fuels strategic business decisions across Lifesight products. A successful candidate will be a self-starter, who drives excellence, is ready to jump into a variety of big data technologies & frameworks, and is able to coordinate and collaborate with other engineers, as well as mentor other engineers in the team.
What You'll Be Doing Build quality data solutions and refine existing diverse datasets into simplified models that encourage self-service Build data pipelines that optimize for data quality and are resilient to poor-quality data sources Low-level systems debugging, performance measurement & optimization on large production clusters Maintain and support existing platforms and evolve to newer technology stacks and architectures We're excited if you have 3+ years of professional experience as a data or software engineer Proficiency in Python and PySpark Deep understanding of Apache Spark, Spark tuning, creating RDDs, and building data frames. Experience creating Java/Scala Spark jobs for data transformation and aggregation. Experience in big data technologies like HDFS, Hive, Kafka, Spark, Airflow, Presto, etc. Experience working with various file formats like Parquet, Avro, etc., for large volumes of data Experience with one or more NoSQL databases Experience with AWS, GCP If you are interested, please submit your details here: https://forms.gle/rgzn7cdVcd2HnQQE7
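The Spark skills this posting asks for (RDDs, data frames, transformation and aggregation jobs) boil down to the map/reduce pattern. Below is a minimal stdlib-Python analogue of Spark's flatMap → map → reduceByKey word count — a sketch of the concept, not actual PySpark code:

```python
from collections import defaultdict

def reduce_by_key(pairs):
    """Group (key, value) pairs and sum values per key,
    mirroring Spark's reduceByKey with an additive combiner."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

lines = ["a b a", "b c"]
pairs = [(w, 1) for line in lines for w in line.split()]  # "flatMap" + "map" stages
counts = reduce_by_key(pairs)                             # "reduce" stage
print(counts)  # {'a': 2, 'b': 2, 'c': 1}
```

In PySpark the equivalent would be roughly `sc.parallelize(lines).flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(operator.add)`, with the key difference that Spark distributes each stage across the cluster.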
Posted 5 days ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Solution Architect – Big Data is a strategic professional who stays abreast of technology developments within their own field and contributes to directional strategy by considering their application in their own job and business. Recognized as a technical authority for an area within the business. Requires basic commercial awareness. Developed communication and diplomacy skills are required in order to guide, influence and convince others, in particular colleagues in other areas and occasional external customers. Significant impact on the area through complex deliverables. Provides advice and counsel related to the technology or operations. Work impacts an entire area, which eventually affects the overall performance and effectiveness of the sub-function/job family. Responsibilities: Executes the architectural vision for all IT systems through major, complex IT architecture projects; ensures that architecture conforms to enterprise blueprints. Develops technology road maps, while keeping up to date with emerging technologies, and recommends business directions based on these technologies. Provides technical leadership and is responsible for developing components of, or the overall systems design. Translates complex business problems into sound technical solutions. Applies hardware engineering and software design theories and principles in researching, designing, and developing product hardware and software interfaces. Provides integrated systems planning and recommends innovative technologies that will enhance the current system. Recommends appropriate desktop, computer platform, and communication links required to support IT goals and strategy. Exhibits good knowledge of how their own specialism contributes to the business and a good understanding of competitors’ products and services. Acts as an advisor or mentor to junior team members. Requires sophisticated analytical thought to resolve issues in a variety of complex situations.
Impacts the architecture function by influencing decisions through advice, counsel or facilitating services. Guides, influences and persuades others with developed communication and diplomacy skills. Performs other job duties and functions as assigned. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 12+ years' experience in Big Data and/or Public Cloud 8 years' experience working on Big Data technologies: Hadoop, HDFS, Hive, Spark, Impala, etc. Technical expertise in the financial services industry and/or regulatory environments Excellent knowledge of and experience in designing cloud-native solutions Experience with migrating on-prem applications to cloud architectures or developing cloud-native applications for any of the following: AWS, Azure, GCP, OpenShift Ability to work across technology stacks and perform R&D on new technologies Proficiency in one or more programming languages like Java, Python, etc.
Consistently demonstrates clear and concise written and verbal communication Management and prioritization skills Ability to develop working relationships Ability to manage multiple activities and changing priorities Ability to work under pressure and to meet tight deadlines Self-starter with the ability to take the initiative and master new tasks quickly Methodical, with attention to detail Preference: Experience architecting Gen AI-based technology solutions is preferred Education: Bachelor’s/University degree or equivalent experience, potentially a Master's degree ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Architecture ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi”) invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster. View the EEO is the Law Supplement. View the EEO Policy Statement. View the Pay Transparency Posting
Posted 5 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Data Engineer Intern – Xiaomi India Location: Bangalore, India Duration: 6 Months Internship Eligibility: Recent graduates (B.Tech/M.Tech in CS, IT, or other related fields) Xiaomi is one of the world’s leading technology companies, with a strong presence in India across smartphones, smart devices, and internet services. At Xiaomi India, data is at the core of all strategic decisions. We’re looking for passionate Data Engineer Interns to work on high-impact projects involving large-scale data systems, data modeling, and pipeline engineering to support business intelligence, analytics, and AI use cases. Key Responsibilities Assist in building scalable data pipelines using Python and SQL. Support data modeling activities for analytics and reporting use cases. Perform data cleansing, transformation, and validation using PySpark. Collaborate with data engineers and analysts to ensure high data quality and availability. Work on Hadoop ecosystem tools to process large datasets. Contribute to data documentation and maintain version-controlled scripts. Technical Skills Required Strong proficiency in Python for data processing and scripting. Good knowledge of SQL – writing complex queries, joins, and aggregations. Understanding of Data Modeling concepts – Star/Snowflake schema, Fact/Dimension tables. Familiarity with the Big Data / Hadoop ecosystem – HDFS, Hive, Spark. Basic exposure to PySpark will be a strong plus. Experience with tools like Jupyter Notebook, VS Code, or any modern IDE. Exposure to cloud platforms (AWS/Azure/GCP/Databricks) is a bonus. Soft Skills Eagerness to learn and work in a fast-paced data-driven environment. Strong analytical thinking and attention to detail. Good communication and collaboration skills. Self-starter with the ability to work independently and in teams.
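The data-modeling concepts this internship lists — a star schema with fact and dimension tables, queried with SQL joins and aggregations — fit in a few lines of stdlib Python using sqlite3. Table names, columns, and figures below are invented purely for illustration:

```python
import sqlite3

# Toy star schema: one fact table keyed to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, qty INTEGER, revenue REAL);
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "phone"), (2, "wearable")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 2, 40000.0), (1, 1, 15000.0), (2, 3, 6000.0)])

# Join + aggregate: revenue per category, the bread-and-butter query shape.
rows = con.execute("""
    SELECT d.category, SUM(f.revenue) AS revenue
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('phone', 55000.0), ('wearable', 6000.0)]
```

The same fact/dimension split scales up directly to Hive or Spark SQL tables; only the engine changes, not the query shape.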
Posted 5 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Introduction A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. Your Role And Responsibilities As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Preferred Education Master's Degree Required Technical And Professional Expertise Strong and proven background in Information Technology & working knowledge of .NET Core, C#, REST API, LINQ, Entity Framework, XUnit. Troubleshooting issues related to code performance. Working knowledge of Angular 15 or later, TypeScript, Jest Framework, HTML 5 and CSS 3 & MS SQL databases, troubleshooting issues related to DB performance. Good understanding of CQRS, mediator, and repository patterns. Good understanding of CI/CD pipelines and SonarQube & messaging and reverse proxy Preferred Technical And Professional Experience Good understanding of AuthN and AuthZ techniques (Windows, basic, JWT). Good understanding of Git and its processes, like pull request, merge, pull, and commit. Methodology skills like Agile, TDD, and UML
Posted 5 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology Your Role And Responsibilities As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your Primary Responsibilities Include Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions Preferred Education Master's Degree Required Technical And Professional Expertise Experience with Apache Spark (PySpark): In-depth knowledge of Spark’s architecture, core APIs, and PySpark for distributed data processing. Big Data Technologies: Familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data Engineering Skills: Strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: Expertise in Python programming with a focus on data processing and manipulation. Data Processing Frameworks: Knowledge of data processing libraries such as Pandas and NumPy. SQL Proficiency: Experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud Platforms: Experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems Preferred Technical And Professional Experience Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: experience with detection and prevention tools for company products and platform- and customer-facing systems.
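The preferred experience for this role centers on end-to-end monitoring of data pipelines. One common shape for this — sketched here in stdlib Python with invented stage and metric names, not an IBM framework — is to wrap each pipeline stage so it reports row counts and wall time:

```python
import time
from functools import wraps

METRICS = []  # in practice these would flow to a metrics backend

def monitored(stage):
    """Decorator recording rows in/out and wall time per pipeline stage."""
    def deco(fn):
        @wraps(fn)
        def wrapper(rows):
            start = time.perf_counter()
            out = fn(rows)
            METRICS.append({"stage": stage,
                            "rows_in": len(rows), "rows_out": len(out),
                            "seconds": time.perf_counter() - start})
            return out
        return wrapper
    return deco

@monitored("filter_nulls")
def filter_nulls(rows):
    return [r for r in rows if r.get("value") is not None]

out = filter_nulls([{"value": 1}, {"value": None}, {"value": 3}])
print(METRICS[0]["rows_in"], METRICS[0]["rows_out"])  # 3 2
```

A large rows_in/rows_out gap or a spike in seconds per stage is exactly the kind of signal an end-to-end monitoring standard would alert on.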
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Introduction A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio Your Role And Responsibilities The developer leads cloud application development/deployment. A developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools Preferred Education Master's Degree Required Technical And Professional Expertise Strong proficiency in Java, Spring Framework, Spring Boot, RESTful APIs; excellent understanding of OOP and design patterns. Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices Strong knowledge of microservice logging, monitoring, debugging and testing; in-depth knowledge of relational databases (e.g., MySQL) Experience in container platforms such as Docker and Kubernetes; experience in messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development Familiar with Ant, Maven or other build automation frameworks; good knowledge of basic UNIX commands Preferred Technical And Professional Experience Experience in concurrent design and multi-threading Primary Skills: - Core Java, Spring Boot, Java2/EE, Microservices - Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.) - Spark Good to have: Python
Posted 5 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description As a Research Analyst, you'll collaborate with experts to develop cutting-edge ML solutions for business needs. You'll drive product pilots, demonstrating innovative thinking and customer focus. You'll build scalable solutions, write high-quality code, and develop state-of-the-art ML models. You'll coordinate between science and software teams, optimizing solutions. The role requires thriving in ambiguous, fast-paced environments and working independently with ML models. Key job responsibilities Collaborate with seasoned Applied Scientists and propose best-in-class ML solutions for business requirements. Dive deep to drive product pilots, demonstrating the Think Big and Customer Obsession leadership principles (LPs) to steer the product roadmap. Build scalable solutions in partnership with Applied Scientists by developing technical intuition to write high-quality code and develop state-of-the-art ML models utilizing the most recent research breakthroughs in academia and industry. Coordinate design efforts between Science and Software teams to deliver optimized solutions. Ability to thrive in ambiguous, uncertain, and fast-moving ML use-case development; familiarity with ML models and the ability to work independently. Mentor Junior Research Analysts (RAs) and contribute to RA hiring. About The Team Retail Business Services Technology (RBS Tech) team develops the systems and science to accelerate Amazon’s flywheel. The team drives three core themes: 1) Find and Fix all customer and selling partner experience (CX and SPX) defects using technology, 2) Generate comprehensive insights for brand growth opportunities, and 3) Completely automate Stores tasks. Our vision for MLOE is to achieve ML operational excellence across Amazon through continuous innovation, scalable infrastructure, and a data-driven approach to optimize value, efficiency, and reliability.
We focus on key areas for enhancing machine learning operations: a) Model Evaluation: expanding the LLM-based audit platform to support multilingual and multimodal auditing, and developing an LLM-powered testing framework for conversational systems to automate the validation of conversational flows, ensuring scalable, accurate, and efficient end-to-end testing. b) Guardrails: building common guardrail APIs that teams can integrate to detect and prevent egregious errors, knowledge-grounding issues, PII breaches, and biases. c) Deployment Framework: supporting LLM deployments and seamlessly integrating them with our release management processes. Basic Qualifications Bachelor's degree in Quantitative or STEM disciplines (Science, Technology, Engineering, Mathematics) 3+ years of relevant work experience in solving real-world business problems using machine learning, deep learning, data mining and statistical algorithms Strong hands-on programming skills in Python, SQL, Hadoop/Hive. Additional knowledge of Spark, Scala, R, Java desired but not mandatory Strong analytical thinking Ability to creatively solve business problems, innovating new approaches where required and articulating ideas to a wide range of audiences using strong data, written and verbal communication skills Ability to collaborate effectively across multiple teams and stakeholders, including development teams, product management and operations. Preferred Qualifications Master's degree with specialization in ML, NLP or Computer Vision preferred 3+ years relevant work experience in a related field/s (project management, customer advocate, product owner, engineering, business analysis) Diverse experience will be favored, e.g.
a mix of experience across different roles. In-depth understanding of machine learning concepts, including developing models and tuning hyper-parameters, as well as deploying models and building ML services. Technical expertise and experience in data science, ML and statistics. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI MAA 15 SEZ Job ID: A2911939
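Model evaluation, which this team's MLOE charter emphasizes, starts with the standard classification metrics. A stdlib-Python sketch of precision, recall and F1 for binary labels — the example labels below are made up, not any Amazon dataset:

```python
def prf1(y_true, y_pred):
    """Precision, recall and F1 for binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

p, r, f = prf1([1, 0, 1, 1, 0], [1, 1, 1, 0, 0])
print(round(p, 2), round(r, 2), round(f, 2))  # 0.67 0.67 0.67
```

Audit platforms of the kind described above typically track these per-slice (per language or modality), not just in aggregate, so regressions in one slice are not masked by the overall average.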
Posted 5 days ago
5.0 years
0 Lacs
India
On-site
About Oportun Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009. WORKING AT OPORTUN Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups. Engineering Business Unit Overview The charter for the Engineering group at Oportun is to be the world-class engineering force behind our innovative products. The group plays a vital role in designing, developing, and maintaining cutting-edge software solutions that power our mission and advance our business. We strike a balance between leveraging leading tools and developing in-house solutions to create member experiences that empower their financial independence. The talented engineers in this group are dedicated to delivering and maintaining performant, elegant, and intuitive systems to our business partners and retail members. Our platform combines service-oriented platform features with sophisticated user experience and is enabled through a best-in-class (and fun to use!) automated development infrastructure.
We prove that FinTech is more fun, more challenging, and in our case, more rewarding as we build technology that changes our members’ lives. Engineering at Oportun is responsible for high quality and scalable technical execution to achieve business goals and product vision. They ensure business continuity to members by effectively managing systems and services - overseeing technical architectures and system health. In addition, they are responsible for identifying and executing on the technical roadmap that enables product vision as well as fosters member & business growth in a scalable and efficient manner. The Enterprise Data and Technology (EDT) pillar within the Engineering Business Unit focusses on enabling wide use of corporate data assets whilst ensuring quality, availability and security across the data landscape. Position Overview As a Senior Data Engineer at Oportun, you will be a key member of our EDT team, responsible for designing, developing, and maintaining sophisticated software / data platforms in achieving the charter of the engineering group. Your mastery of a technical domain enables you to take up business problems and solve them with a technical solution. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. This is a role where you will have the opportunity to take up responsibility in leading the technology effort – from technical requirements gathering to final successful delivery of the product - for large initiatives (cross functional and multi-month long projects). Responsibilities Data Architecture and Design: Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements. 
Collaborate with stakeholders to understand data requirements, build subject matter expertise and define optimal data models and structures. Data Pipeline Development and Optimization: Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data. Optimize data pipelines for performance, reliability, and scalability. Database Management and Optimization: Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security. Implement and manage ETL processes for efficient data loading and retrieval. Data Quality and Governance: Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations. Drive initiatives to improve data quality and documentation of data assets. Mentorship and Leadership: Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth. Lead and participate in code reviews, ensuring best practices and high-quality code. Collaboration and Stakeholder Management: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet those needs. Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value. Performance Monitoring and Optimization: Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability. Common Software Engineering Requirements You actively contribute to the end-to-end delivery of complex software applications, ensuring adherence to best practices and high overall quality standards. 
You have a strong understanding of a business or system domain, with sufficient knowledge and expertise around the appropriate metrics and trends. You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective software solutions. You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility. You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability. You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team. You play a significant role in the ongoing evolution and refinement of current tools and applications used by the team, and drive adoption of new practices within your team. You take ownership of (customer) issues, including initial troubleshooting, identification of root cause and issue escalation or resolution, while maintaining the overall reliability and performance of our systems. You set the benchmark for responsiveness, ownership and overall accountability of engineering systems. You independently drive and lead multiple features, contribute to large projects and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed. You keep your lead/EM informed about your work and that of the team so they can share it with stakeholders, including escalation of issues.
Requirements
Bachelor's or Master's degree in Computer Science, Data Science, or a related field. 5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
Proficiency in programming languages such as Python/PySpark and Java/Scala. Expertise in big data technologies such as Hadoop, Spark, and Kafka. In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MySQL, NoSQL databases). Experience and expertise in building complex end-to-end data pipelines. Experience with orchestration and job-schedule design using tools such as Airflow, and CI/CD tools such as Jenkins. Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.). Ability to mentor junior team members. Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse). Strong leadership, problem-solving, and decision-making skills. Excellent communication and collaboration abilities. We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate. California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/. We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI’s Internet Crime Complaint Center (IC3). Show more Show less
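The end-to-end pipeline skills this posting asks for can be illustrated with a minimal sketch. This is a pure-Python stand-in for what would normally be a PySpark job scheduled by Airflow; the column names and figures here are invented purely for illustration:

```python
import csv
import io

# Raw input as it might arrive from an upstream source; in a real pipeline
# this would be read from S3, HDFS, or a database rather than a string.
RAW = """user_id,amount,country
1,120.50,IN
2,,IN
3,80.00,US
4,300.25,IN
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop rows with missing amounts and cast types."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: skip incomplete records
        clean.append({"user_id": int(row["user_id"]),
                      "amount": float(row["amount"]),
                      "country": row["country"]})
    return clean

def load(rows: list[dict]) -> dict[str, float]:
    """Load: aggregate revenue per country (stand-in for a warehouse write)."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

totals = load(transform(extract(RAW)))
```

In a production setting each stage would become its own task in the orchestrator, so failures can be retried per stage rather than rerunning the whole pipeline.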
Posted 5 days ago
3.0 - 5.0 years
10 - 12 Lacs
Mumbai
On-site
JOB DESCRIPTION
About the Advanced Analytics team
The central Advanced Analytics team at the Abbott Established Pharma Division’s (EPD) headquarters in Basel helps define and lead the transformation towards becoming a global, data-driven company with the help of data and advanced technologies (e.g., Machine Learning, Deep Learning, Generative AI, Computer Vision). To us, Advanced Analytics is an important lever to reach our business targets, now and in the future; it helps differentiate us from our competition and ensure sustainable revenue growth at optimal margins. Hence the central AA team is an integral part of the Strategy Management Office at EPD, which has a very close link and regular interactions with the EPD Senior Leadership Team.
Primary Job Function: With the above requirements in mind, EPD is looking to fill the role of a Data Scientist to build and refine effective Data Science solutions for Abbott EPD worldwide.
Core Job Responsibilities: The Data Scientist rapidly navigates from identifying priorities and helping to generate ideas to implementing solutions. They:
Participate in and drive data collection, cleaning, analysis and interpretation (EDA).
Collaborate with business partners and product owners to ideate on solutions to challenging problems.
Generate insightful visualizations to communicate findings.
Carry out model selection, validation and possible ways for deployment (in collaboration with the engineering team).
Write high-quality code with the possibility of deployment in mind.
Share learnings and findings with other data scientists, contributing to the collaborative environment.
Collaborate with Sr. Data Scientists and take full responsibility for analysis and modeling tasks.
Build effective and efficient AA solutions to business needs, leveraging available market resources as much as possible.
Stay committed to continuous learning about the latest trends and technologies.
Work closely with the Product Owners and the Engineering team to ensure delivery of the Data Science part of projects within time, cost and quality.
Collaborate with external vendors, evaluating their capabilities and ensuring their alignment with data science standards and project requirements.
Continuously engage in hands-on data analysis, modeling, and prototyping DS frameworks to deliver high-quality outputs.
Supervisory/Management Responsibilities: Direct Reports: None. Indirect Reports: None.
Position Accountability/Scope: The Data Scientist is responsible for delivering targeted business impact per initiative in collaboration with key stakeholders and identifying next steps and future impactful opportunities. This individual contributor role involves working with cross-functional teams to build innovative solutions for internal business functions across different geographies.
Minimum Education: Master's or PhD in a relevant field (e.g., applied mathematics, computer science, engineering, applied statistics).
Minimum Experience: At least 3-5 years of relevant working experience, ideally in a pharma environment.
Solid experience working on full-lifecycle data science; experience in applying data science methods to business problems (experience in the financial/commercial or manufacturing/supply chain areas a plus).
Strong experience in, e.g., data mining, statistical modelling, predictive modelling, and development of machine learning algorithms.
Proven problem-solving ability in international settings, preferably with developing markets.
Proven experience working in a cloud environment, preferably AWS/SageMaker.
Practical experience in deploying machine learning solutions.
Strong understanding of good software engineering principles and best practices.
Ability to work with and lead cross-functional teams to bring business and data science closer together; consultancy experience a plus.
Intrinsic motivation to guide people and make Advanced Analytics more accessible to a broader range of stakeholders.
Deep domain expertise in a specific field, such as Artificial Intelligence, Machine Learning, Natural Language Processing, or Computer Vision.
Strong programming skills in languages such as Python or R, with proficiency in data manipulation, wrangling, and modeling techniques.
Strong experience building and debugging complex SQL queries.
Excellent knowledge of statistical techniques, machine learning algorithms, and their practical implementation in real-world scenarios.
Exceptional communication and presentation skills, with the ability to convey complex concepts and insights to both technical and non-technical stakeholders.
Proven track record of delivering data-driven solutions that have had a measurable impact on business outcomes.
Exposure to big data technologies (e.g., Hadoop, Spark) is highly desirable.
Demonstrated ability to drive the adoption of data science best practices, standards, and methodologies within an organization.
Fluency in English a must; additional languages a plus.
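The model selection and validation work this role describes can be sketched in miniature. The snippet below fits a one-feature ordinary-least-squares model on a holdout split using only the standard library; a real project would use scikit-learn or statsmodels, and the data here is synthetic:

```python
# Toy data: in practice these points would come from the EDA/cleaning steps.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
ys = [2 * x + 1 for x in xs]  # a perfectly linear relationship, for clarity

# Holdout split: fit on the first 6 points, validate on the last 2.
train_x, val_x = xs[:6], xs[6:]
train_y, val_y = ys[:6], ys[6:]

def fit_ols(x, y):
    """Closed-form ordinary least squares for a single feature."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    slope = cov / var
    return slope, my - slope * mx

slope, intercept = fit_ols(train_x, train_y)
preds = [slope * x + intercept for x in val_x]
mae = sum(abs(p - t) for p, t in zip(preds, val_y)) / len(val_y)
```

The holdout error (here MAE) is what drives model selection: the candidate model with the lowest validation error, not the lowest training error, is the one that should move toward deployment.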
Posted 5 days ago
12.0 years
0 Lacs
Bengaluru
On-site
As a Senior Manager for data science, data modelling & Analytics, you will lead a team of data scientists and analysts while actively contributing to the development and implementation of advanced analytics solutions. This role requires a blend of strategic leadership and hands-on technical expertise to drive data-driven decision-making across the organization. Job Description: Key Responsibilities Hands-On Technical Contribution Design, develop, and deploy advanced machine learning models and statistical analyses to solve complex business problems. Utilize programming languages such as Python, R, and SQL to manipulate data and build predictive models. Understand end-to-end data pipelines, including data collection, cleaning, transformation, and visualization. Collaborate with IT and data engineering teams to integrate analytics solutions into production environments. Provide thought leadership on solutions and metrics based on the understanding of nature of business requirement. Team Leadership & Development Lead, mentor, and manage a team of data scientists and analysts, fostering a collaborative and innovative environment. Provide guidance on career development, performance evaluations, and skill enhancement. Promote continuous learning and adoption of best practices in data science methodologies. Engage and manage a hierarchical team while fostering a culture of collaboration. Strategic Planning & Execution Collaborate with senior leadership to define the data science strategy aligned with business objectives. Identify and prioritize high-impact analytics projects that drive business value. Ensure the timely delivery of analytics solutions, balancing quality, scope, and resource constraints. Client Engagement & Stakeholder Management Serve as the primary point of contact for clients, understanding their business challenges and translating them into data science solutions. 
Lead client presentations, workshops, and discussions to communicate complex analytical concepts in an accessible manner. Develop and maintain strong relationships with key client stakeholders, ensuring satisfaction and identifying opportunities for further collaboration. Manage client expectations, timelines, and deliverables, ensuring alignment with business objectives. Develop and deliver regular reports and dashboards to senior management, market stakeholders and clients highlighting key insights and performance metrics. Act as a liaison between technical teams and business units to align analytics initiatives with organizational goals. Cross-Functional Collaboration Work closely with cross capability teams such as Business Intelligence, Market Analytics, Data engineering to integrate analytics solutions into business processes. Translate complex data insights into actionable recommendations for non-technical stakeholders. Facilitate workshops and presentations to promote data driven conversations across the organization. Closely working with support functions to provide timely updates to leadership on operational metrics. Governance & Compliance Ensure adherence to data governance policies, including data privacy regulations (e.g., GDPR, PDPA). Implement best practices for data quality, security, and ethical use of analytics. Stay informed about industry trends and regulatory changes impacting data analytics. Qualifications Education: Bachelor’s or Master’s degree in Data Science, Computer Science, Statistics, Mathematics, or a related field. Experience: 12+ years of experience in advanced analytics, data science, data modelling, machine learning, Generative AI or a related field with 5+ years in a leadership capacity. Proven track record of managing and delivering complex analytics projects. 
Familiarity with the BFSI/Hi Tech/Retail/Healthcare industry and experience with product, transaction, and customer-level data Experience with media data will be advantageous Technical Skills: Proficiency in programming languages like Python, R, or SQL. Experience with data visualization tools (e.g., Tableau, Power BI). Familiarity with big data platforms (e.g., Hadoop, Spark) and cloud services (e.g., AWS, GCP, Azure). Knowledge of machine learning frameworks and libraries. Soft Skills: Strong analytical and problem-solving abilities. Excellent communication and interpersonal skills. Ability to influence and drive change within the organization. Strategic thinker with a focus on delivering business outcomes. Desirable Attributes Proficient in the following advanced analytics techniques (Should have proficiency in most) Descriptive Analytics: Statistical analysis, data visualization. Predictive Analytics: Regression analysis, time series forecasting, classification techniques, market mix modelling Prescriptive Analytics: Optimization, simulation modelling. Text Analytics: Natural Language Processing (NLP), sentiment analysis. Extensive knowledge of machine learning techniques, including (Should have proficiency in most) Supervised Learning: Linear regression, logistic regression, decision trees, support vector machines, random forests, gradient boosting machines among others Unsupervised Learning: K-means clustering, hierarchical clustering, principal component analysis (PCA), anomaly detection among others Reinforcement Learning: Q-learning, deep Q-networks, etc. Experience with Generative AI and large language models (LLMs) for text generation, summarization, and conversational agents (Good to Have) Researching, loading and application of the best LLMs (GPT, Gemini, LLAMA, etc.) 
for various objectives; hyperparameter tuning; prompt engineering; embedding and vectorization; fine-tuning. Proficiency in data visualization tools such as Tableau or Power BI (Good to Have). Strong skills in data management, structuring, and harmonization to support analytical needs (Must Have). Location: Bengaluru. Brand: Merkle. Time Type: Full time. Contract Type: Permanent.
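Of the unsupervised techniques listed above (k-means, hierarchical clustering, PCA), k-means is compact enough to sketch from scratch. This is an illustrative, minimal implementation with naive initialization, not production code; libraries such as scikit-learn handle initialization and convergence far more robustly:

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means: alternate assignment and centroid-update steps."""
    centroids = points[:k]  # naive init: first k points (real code uses k-means++)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centroid
            idx = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[idx].append(p)
        # move each centroid to the mean of its assigned points
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated blobs; k-means should recover one centroid per blob.
pts = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1), (8.0, 8.0), (8.2, 7.9), (7.8, 8.1)]
centroids, clusters = kmeans(pts, k=2)
```

Even in this toy form the two-step structure (assign, then update) is the same loop that scales to customer-segmentation workloads on Spark.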
Posted 5 days ago
2.0 years
3 - 5 Lacs
Bengaluru
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com. Job Description Do Research, design, develop, and modify computer vision and machine learning algorithms and models, leveraging experience with technologies such as Caffe, Torch, or TensorFlow. - Shape product strategy for highly contextualized applied ML/AI solutions by engaging with customers, solution teams, discovery workshops and prototyping initiatives. - Help build a high-impact ML/AI team by supporting recruitment, training and development of team members. - Serve as evangelist by engaging in the broader ML/AI community through research, speaking/teaching, formal collaborations and/or other channels. Knowledge & Abilities: - Designing integrations of and tuning machine learning & computer vision algorithms - Research and prototype techniques and algorithms for object detection and recognition - Convolutional neural networks (CNN) for performing image classification and object detection.
- Familiarity with Embedded Vision Processing systems - Open source tools & platforms - Statistical Modeling, Data Extraction, Analysis - Construct, train, evaluate and tune neural networks Mandatory Skills: One or more of the following: Java, C++, Python; deep learning frameworks such as Caffe, Torch or TensorFlow; and an image/video vision library such as OpenCV, Clarifai, Google Cloud Vision, etc. Supervised & Unsupervised Learning. Developed feature learning, text mining, and prediction models (e.g., deep learning, collaborative filtering, SVM, and random forest) on big data computation platforms (Hadoop, Spark, Hive, and Tableau). One or more of the following: Tableau, Hadoop, Spark, HBase, Kafka. Experience: - 2-5 years of work or educational experience in Machine Learning or Artificial Intelligence - Creation and application of Machine Learning algorithms to a variety of real-world problems with large datasets - Building scalable machine learning systems and data-driven products working with cross-functional teams - Working with cloud services like AWS, Microsoft, IBM, and Google Cloud - Working with one or more of the following: Natural Language Processing, text understanding, classification, pattern recognition, recommendation systems, targeting systems, ranking systems or similar Nice to Have: - Contribution to research communities and/or efforts, including publishing papers at conferences such as NIPS, ICML, ACL, CVPR, etc. Education: BA/BS (advanced degree preferable) in Computer Science, Engineering or related technical field, or equivalent practical experience. Wipro is an Equal Employment Opportunity employer and makes all employment and employment-related decisions without regard to a person's race, sex, national origin, ancestry, disability, sexual orientation, or any other status protected by applicable law. Reinvent your world. We are building a modern Wipro.
We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
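The CNN-based image classification this posting centers on rests on one core operation: sliding a small kernel over an image. Below is a minimal pure-Python sketch of that valid-mode 2D convolution; frameworks such as TensorFlow or Torch implement the same idea with optimized tensor kernels, and the tiny image and edge-detector kernel here are invented for illustration:

```python
def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # dot product of the kernel with the image patch under it
            acc = sum(image[i + di][j + dj] * kernel[di][dj]
                      for di in range(kh) for dj in range(kw))
            row.append(acc)
        out.append(row)
    return out

# A vertical-edge detector applied to an image with a sharp vertical edge:
# the feature map responds strongly only where the edge sits.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [[-1, 1],
          [-1, 1]]
feature_map = conv2d(image, kernel)
```

In a trained CNN the kernel weights are learned rather than hand-set; stacking many such learned filters, with nonlinearities and pooling between them, is what turns this operation into an image classifier.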
Posted 5 days ago
5.0 years
10 - 12 Lacs
Bhopal
On-site
About the Role : We are seeking a highly skilled Senior AI/ML Engineer to join our dynamic team. The ideal candidate will have extensive experience in designing, building, and deploying machine learning models and AI solutions to solve real-world business challenges. You will collaborate with cross-functional teams to create and integrate AI/ML models into end-to-end applications, ensuring models are accessible through APIs or product interfaces for real-time usage. Responsibilities Lead the design, development, and deployment of machine learning models for various use cases such as recommendation systems, computer vision, natural language processing (NLP), and predictive analytics. Work with large datasets to build, train, and optimize models using techniques such as classification, regression, clustering, and neural networks. Fine-tune pre-trained models and develop custom models based on specific business needs. Collaborate with data engineers to build scalable data pipelines and ensure the smooth integration of models into production. Collaborate with frontend/backend engineers to build AI-driven features into products or platforms. Build proof-of-concept or production-grade AI applications and tools with intuitive UIs or workflows. Ensure scalability and performance of deployed AI solutions within the full application stack. Implement model monitoring and maintenance strategies to ensure performance, accuracy, and continuous improvement of deployed models. Design and implement APIs or services that expose machine learning models to frontend or other systems Utilize cloud platforms (AWS, GCP, Azure) to deploy, manage, and scale AI/ML solutions. Stay up-to-date with the latest advancements in AI/ML research, and apply innovative techniques to improve existing systems. Communicate effectively with stakeholders to understand business requirements and translate them into AI/ML-driven solutions. 
Document processes, methodologies, and results for future reference and reproducibility. Required Skills & Qualifications Experience : 5+ years of experience in AI/ML engineering roles, with a proven track record of successfully delivering machine learning projects. AI/ML Expertise : Strong knowledge of machine learning algorithms (supervised, unsupervised, reinforcement learning) and AI techniques, including NLP, computer vision, and recommendation systems. Programming Languages : Proficient in Python and relevant ML libraries such as TensorFlow, PyTorch, Scikit-learn, and Keras. Data Manipulation : Experience with data manipulation libraries such as Pandas, NumPy, and SQL for managing and processing large datasets. Model Development : Expertise in building, training, deploying, and fine-tuning machine learning models in production environments. Cloud Platforms : Experience with cloud platforms such as AWS, GCP, or Azure for the deployment and scaling of AI/ML models. MLOps : Knowledge of MLOps practices for model versioning, automation, and monitoring. Data Preprocessing : Proficient in data cleaning, feature engineering, and preparing datasets for model training. Strong experience building and deploying end-to-end AI-powered applications— not just models but full system integration. Hands-on experience with Flask, FastAPI, Django, or similar for building REST APIs for model serving. Understanding of system design and software architecture for integrating AI into production environments. Experience with frontend/backend integration (basic React/Next.js knowledge is a plus). Demonstrated projects where AI models were part of deployed user-facing applications. NLP & Computer Vision: Hands-on experience with natural language processing or computer vision projects. Big Data: Familiarity with big data tools and frameworks (e.g., Apache Spark, Hadoop) is an advantage. 
Problem-Solving Skills: Strong analytical and problem-solving abilities, with a focus on delivering practical AI/ML solutions. Nice to Have Experience with deep learning architectures (CNNs, RNNs, GANs, etc.) and techniques. Knowledge of deployment strategies for AI models using APIs, Docker, or Kubernetes. Experience building full-stack applications powered by AI (e.g., chatbots, recommendation dashboards, AI assistants, etc.). Experience deploying AI/ML models in real-time environments using API gateways, microservices, or orchestration tools like Docker and Kubernetes. Solid understanding of statistics and probability. Experience working in Agile development environments. What You'll Gain Be part of a forward-thinking team working on cutting-edge AI/ML technologies. Collaborate with a diverse, highly skilled team in a fast-paced environment. Opportunity to work on impactful projects with real-world applications. Competitive salary and career growth opportunities Job Type: Full-time Pay: ₹1,000,000.00 - ₹1,200,000.00 per year Schedule: Day shift Fixed shift Work Location: In person
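The model-serving requirement above (exposing models via Flask/FastAPI REST APIs) largely boils down to validating a JSON payload and returning a prediction envelope. The sketch below shows that core logic as a plain function so it stays framework-agnostic; the feature names and scoring formula are invented for the example, and in a real service the function body would sit behind a FastAPI route:

```python
# A stand-in "model": in production this would be a trained artifact loaded
# from a model registry (sklearn, PyTorch, etc.), not a hand-written formula.
def score(features: dict) -> float:
    return 0.3 * features["tenure_months"] + 0.5 * features["monthly_usage"]

REQUIRED = ("tenure_months", "monthly_usage")

def predict_endpoint(payload: dict) -> dict:
    """Validate a JSON payload and return a prediction envelope.

    In a real service this body would sit inside a FastAPI route, e.g.
        @app.post("/predict")
        def predict(payload: PredictRequest): ...
    with Pydantic handling the validation shown manually here.
    """
    missing = [f for f in REQUIRED if f not in payload]
    if missing:
        return {"status": 422, "error": f"missing fields: {missing}"}
    try:
        features = {f: float(payload[f]) for f in REQUIRED}
    except (TypeError, ValueError):
        return {"status": 422, "error": "fields must be numeric"}
    return {"status": 200, "prediction": score(features)}
```

Keeping validation and scoring in a plain function like this also makes the endpoint unit-testable without spinning up an HTTP server.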
Posted 5 days ago
6.0 - 9.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals. At Morgan Stanley India, we support the Firm’s global businesses, with critical presence across Institutional Securities, Wealth Management, and Investment management, as well as in the Firm’s infrastructure functions of Technology, Operations, Finance, Risk Management, Legal and Corporate & Enterprise Services. Morgan Stanley has been rooted in India since 1993, with campuses in both Mumbai and Bengaluru. We empower our multi-faceted and talented teams to advance their careers and make a global impact on the business. For those who show passion and grit in their work, there’s ample opportunity to move across the businesses. Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents. Division Profile The Analytics & Data (A&D) organization is a key growth area within Morgan Stanley’s Wealth Management Division, playing a critical role in the execution of the wider Wealth Management strategy. The use of analytics and data is a key driver in accelerating growth across WM business, enabling data-driven decision-making and delivering the best client experience. Position Summary Director level role to join our Analytics & Data organization as part of the Key Metrics Analytics function. 
The candidate will be responsible for the ongoing monitoring, analysis and reporting of key business data metrics for management and stakeholders, collaborating to help determine timely solutions and providing actionable insights. Key Responsibilities Key responsibilities will include, but will not be limited to, the following: Analytical skills – Analyze data to identify key trends and anomalies in key business metrics. Problem-solving skills – Ability to identify key issues, gather data to investigate those issues and develop actionable recommendations. Develop a suite of processes for data analysis, trend identification and detection of data discrepancies. Teamwork skills – The candidate must be flexible in their work style and be able to work collaboratively with A&D team members in Mumbai, NY/NJ, and Budapest. Provide reporting support as needed. Experience 6-9 years’ experience in a data-centric role. Bachelor’s degree required. Experience in Data & Analytics and/or Financial Services is a plus. Required Skills Ability to work independently and possess a strong sense of accountability/ownership. Must be a self-starter and a quick learner, able to prioritize and delegate effectively to manage delivery/execution of a wide range of tasks and initiatives. Strong written, verbal, presentation, and interpersonal skills. Proficient in SQL, MS Excel, PowerPoint. Experience or knowledge in these skills is a plus – Snowflake, Hadoop, Python, Dataiku, Tableau. Ability to work in a team environment and partner with multiple individuals across various groups. Strong analytical and problem-solving skills. Strong diligence and aptitude for working with numerical information. Experience and strong aptitude in using different data environments. Demonstrated effectiveness with both oral and written communication. Comfortable in a fast-paced and evolving environment which includes ongoing learning and training opportunities.
Experience in managing a small team (2-3 headcount) is a plus. What You Can Expect From Morgan Stanley We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years. Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren’t just beliefs, they guide the decisions we make every day to do what's best for our clients, communities and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you’ll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There’s also ample opportunity to move about the business for those who show passion and grit in their work. Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents. Show more Show less
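The trend-and-anomaly monitoring this role centers on is often bootstrapped with a simple z-score rule before anything more sophisticated is built. A minimal standard-library sketch, with illustrative numbers only:

```python
import statistics

def flag_anomalies(values, threshold=2.0):
    """Flag indices whose z-score exceeds the threshold."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # a flat series has no anomalies to flag
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > threshold]

# A daily business metric (illustrative numbers): day 5 is an obvious outlier.
daily_metric = [100, 102, 98, 101, 99, 160, 100, 97]
anomalies = flag_anomalies(daily_metric)
```

In practice the same rule would run over metrics pulled from Snowflake or Hadoop via SQL, with flagged days feeding the stakeholder reporting described above.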
Posted 5 days ago
The demand for Hadoop professionals in India has been on the rise in recent years, with many companies leveraging big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, it is important to understand the job market, salary expectations, career progression, related skills, and common interview questions.
India's major IT hubs are known for their thriving technology industry and have a high demand for Hadoop professionals.
The average salary range for Hadoop professionals in India varies based on experience levels. Entry-level Hadoop developers can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.
In the Hadoop field, a typical career path may include roles such as Junior Developer, Senior Developer, Tech Lead, and eventually progressing to roles like Data Architect or Big Data Engineer.
In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.
As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!