7.0 - 10.0 years
9 - 12 Lacs
Hyderabad
Hybrid
Responsibilities of the Candidate:
- Own the design and development of big data solutions; partner with domain experts, product managers, analysts, and data scientists to develop Big Data pipelines in Hadoop
- Move all legacy workloads to a cloud platform
- Work with data scientists to build pipelines from heterogeneous sources and provide engineering services for data science applications in PySpark
- Ensure automation through CI/CD across platforms, both in the cloud and on-premises
- Define needs around maintainability, testability, performance, security, quality, and usability for the data platform
- Drive implementation of consistent patterns, reusable components, and coding standards for data engineering processes
- Convert SAS-based pipelines into languages such as PySpark and Scala to execute on Hadoop and non-Hadoop ecosystems
- Tune Big Data applications on Hadoop and non-Hadoop platforms for optimal performance
- Apply an in-depth understanding of how data analytics integrates within the sub-function, and coordinate and contribute to the objectives of the entire function
- Produce detailed analyses of issues where the best course of action is not evident from the available information but must nonetheless be recommended or taken
- Assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency

Requirements:
- 6+ years of total IT experience
- 3+ years of experience with Hadoop (Cloudera) / big data technologies
- Knowledge of the Hadoop ecosystem and Big Data technologies
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr)
- Experience designing and developing data pipelines for data ingestion or transformation using Java, Scala, or Python
- Experience with Spark programming (PySpark, Scala, or Java)
- Hands-on experience with Python/PySpark/Scala and basic machine learning libraries
- Proficiency in programming in Java or Python; prior Apache Beam/Spark experience is a plus
- Hands-on experience with CI/CD, scheduling, and scripting
- System-level understanding: data structures, algorithms, distributed storage and compute
- A can-do attitude toward solving complex business problems, plus good interpersonal and teamwork skills
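The extract-transform-load flow at the heart of the pipeline work described above can be sketched in plain Python. This is a minimal illustration only: a production pipeline for a role like this would use PySpark DataFrames on a Hadoop cluster, and the record fields and helper names here are invented for the example.

```python
# Minimal extract-transform-load sketch in plain Python.
# Field names ("account", "amount") and the in-memory "load" step are
# illustrative stand-ins for real sources and table writes.

def extract(raw_rows):
    """Parse raw CSV-like rows into dicts (the 'extract' step)."""
    return [dict(zip(("account", "amount"), row.split(","))) for row in raw_rows]

def transform(records):
    """Filter bad rows and derive a typed column (the 'transform' step)."""
    out = []
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # drop malformed rows instead of failing the whole batch
        out.append({"account": r["account"], "amount": amount})
    return out

def load(records):
    """Aggregate per account (the 'load' step, standing in for a table write)."""
    totals = {}
    for r in records:
        totals[r["account"]] = totals.get(r["account"], 0.0) + r["amount"]
    return totals

raw = ["a1,10.5", "a2,3.0", "a1,oops", "a1,4.5"]
print(load(transform(extract(raw))))  # → {'a1': 15.0, 'a2': 3.0}
```

The same three stages map directly onto PySpark's read, DataFrame transformation, and write APIs; only the scale changes.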
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Cloud Support Engineer at Snowflake, you will be an integral part of the expanding Support team, dedicated to providing high-quality resolutions and delivering data-driven business insights to customers. Your technical expertise and passion for Snowflake Data Warehouse will be essential in guiding customers on effective and optimal use of the platform. Your primary responsibilities will include driving technical solutions to complex problems, maintaining response and resolution SLAs, documenting solutions, reporting bugs and feature requests, and collaborating with engineering teams. You will also participate in strategic initiatives, provide support coverage as needed, and work closely with Snowflake Priority Support customers to ensure the highest levels of continuity and performance. To excel in this role, you should have a Bachelor's or Master's degree in Computer Science or a related field, along with at least 5 years of experience in a Technical Support environment. Your solid knowledge of RDBMS, SQL data types, aggregations, and advanced functions, as well as experience in query performance tuning and system metrics interpretation, will be highly beneficial. In addition to the core requirements, the ideal candidate will have expertise in database patch and release management, distributed computing principles, scripting/coding, database migration, and ETL experience. Proficiency in monitoring and optimizing cloud spending using cost management tools and strategies will be considered a plus. Special requirements for this role include participating in pager duty rotations during nights, weekends, and holidays, as well as the ability to work the 4th/night shift starting from 10 pm IST. Applicants should be flexible with schedule changes to accommodate business needs. 
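The SQL aggregation and query-tuning skills this role asks for can be illustrated with the standard-library sqlite3 module. Snowflake's engine, data types, and profiling tools differ substantially, and the table and columns below are hypothetical; the point is only the general pattern of aggregating and then checking whether a filter uses an index.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (user_id INTEGER, latency_ms REAL)")
cur.executemany("INSERT INTO events VALUES (?, ?)",
                [(1, 120.0), (1, 80.0), (2, 200.0), (2, 100.0)])

# Aggregation: a per-user rollup with COUNT and AVG.
rows = cur.execute(
    "SELECT user_id, COUNT(*), AVG(latency_ms) FROM events "
    "GROUP BY user_id ORDER BY user_id").fetchall()
print(rows)  # [(1, 2, 100.0), (2, 2, 150.0)]

# Query tuning: adding an index turns the filter below from a full
# table scan into an index search, visible in the query plan.
cur.execute("CREATE INDEX idx_user ON events (user_id)")
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 1").fetchall()
print(plan)
```

In Snowflake the equivalent investigation would go through the query profile and system metrics rather than `EXPLAIN QUERY PLAN`, but the reasoning is the same.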
If you are looking to join a fast-growing team that values innovation, customer success, and continuous improvement, Snowflake offers a dynamic environment where you can make a significant impact on both the company and your own professional growth. Join us in redefining what's possible with data and technology.
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
As an Experienced Senior Data Engineer at Adobe, you will utilize Big Data and Google Cloud technologies to develop large-scale, on-cloud data processing pipelines and data warehouses. Your role will involve consulting with customers worldwide on their data engineering needs around Adobe's Customer Data Platform and supporting pre-sales discussions regarding complex and large-scale cloud data engineering solutions. You will design custom solutions on cloud by integrating Adobe's solutions in a scalable and performant manner. Additionally, you will deliver complex, large-scale, enterprise-grade on-cloud data engineering and integration solutions in a hands-on manner. To be successful in this role, you should have a total of 12 to 15 years of experience, with 3 to 4 years of experience leading Data Engineer teams in developing enterprise-grade data processing pipelines on Google Cloud. You must have led at least one project of medium to high complexity involving the migration of ETL pipelines and Data warehouses to the cloud. Your recent 3 to 5 years of experience should be with premium consulting companies. Profound hands-on expertise with Google Cloud Platform services, especially BigQuery, Dataform, Dataplex, etc., is essential. Exceptional communication skills are crucial for effectively engaging with Data Engineers, Technology, and Business leadership. Furthermore, the ability to leverage knowledge of GCP to other cloud environments is highly desirable. It would be advantageous to have experience consulting with customers in India and possess multi-cloud expertise, with knowledge of AWS and GCP. At Adobe, creativity, curiosity, and continuous learning are valued qualities that contribute to your career growth journey. To pursue a new opportunity at Adobe, ensure to update your Resume/CV and Workday profile, including your unique Adobe experiences and volunteer work. 
Familiarize yourself with the Internal Mobility page on Inside Adobe to understand the process and set up job alerts for roles that interest you. Prepare for interviews by following the provided tips. Upon applying for a role via Workday, the Talent Team will contact you within 2 weeks. If you progress to the official interview process with the hiring team, inform your manager to support your career growth. At Adobe, you will experience an exceptional work environment recognized globally. You will collaborate with colleagues dedicated to mutual growth through the Check-In approach, where ongoing feedback is encouraged. If you seek to make an impact, Adobe is the ideal place for you. Explore employee career experiences on the Adobe Life blog and discover the meaningful benefits offered. For individuals with disabilities or special needs requiring accommodation to navigate the Adobe.com website or complete the application process, contact accommodations@adobe.com or call (408) 536-3015.
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
delhi
On-site
The Sr. Manager in Industry X at our New Delhi office plays a crucial role in driving client value creation through Industry X (Digital Transformation) projects and supporting the end-to-end sales lifecycle. As an Industry X expert, you will conduct assessments of client manufacturing operations to identify opportunities for Industry X interventions. You should have a solid grasp of Industry X concepts and technologies like Industrial IoT, Predictive Maintenance, and Digital Twins, along with a deep understanding of lean manufacturing principles and their integration with Industry X solutions. Experience with Manufacturing Execution Systems (MES) and other relevant industry-specific software is considered advantageous. Your responsibilities will include analyzing client needs, developing actionable roadmaps utilizing Industry X technologies such as IoT, Big Data, Cloud, AI, and Machine Learning, and presenting compelling business cases showcasing the ROI potential of Industry X initiatives. You will lead client engagements, collaborate with internal and external stakeholders, and stay updated on the latest Industry X trends to provide cutting-edge insights to clients. The ideal candidate for this position will have a minimum of 10 years of experience in management consulting, with a focus on Resources industries like Automotive, Electronics & Semiconductors, and Machinery & Equipment. You should be adept at successfully managing complex client engagements within the discrete manufacturing domain and have a strong background in utilizing data and technologies such as AR/VR, cloud, AI, 5G, robotics, and digital twins. To excel in this role, you will need a Bachelor's degree in Engineering, Business Administration, or a related field, with a specialization in industrial engineering, manufacturing engineering, or a similar discipline. 
This position offers the opportunity to work with market leaders in building resilient and agile businesses that adapt engineering, infrastructure & capital projects, and manufacturing operations in the face of change. If you are passionate about driving digital transformation in the manufacturing sector and possess the required expertise and experience, we invite you to join our team at Accenture as a Sr. Manager in Industry X.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As a Senior Associate at PricewaterhouseCoopers (PwC), you will be part of a team of problem solvers dedicated to addressing complex business challenges from strategy to execution. In this role, you will be expected to leverage your skills and knowledge in various areas, including but not limited to:
- Utilizing feedback and reflection to enhance self-awareness and personal growth.
- Delegating tasks to provide growth opportunities for team members and coaching them to achieve results.
- Demonstrating critical thinking skills and the ability to structure solutions for intricate problems.
- Employing a wide range of tools and techniques to derive insights from current industry trends.
- Leading day-to-day tasks within the team, assisting in work planning, and ensuring quality, accuracy, and relevance in the work done.
- Contributing to practice enablement and business development initiatives.
- Learning new tools and technologies as necessary.
- Developing and implementing automation solutions aligned with client requirements.
- Demonstrating proficiency in selecting and utilizing tools for different situations, with the ability to justify the choice.
- Communicating effectively and persuasively while engaging with others.
- Adhering to the firm's ethical standards and business conduct guidelines.

The ideal candidate for this position should possess the following qualifications and experience:
- Dual degree/Master's degree from reputable institutes in fields such as Data Science, Data Analytics, Finance, Accounting, Business Administration/Management, Economics, Statistics, Computer and Information Science, Management Information Systems, Engineering, or Mathematics.
- 4-7 years of work experience in analytics consulting and/or transaction services with leading consulting organizations.
- Experience throughout the entire Deals Cycle, including diligence, post-deal value creation, and exit preparation.
In addition to the above, candidates with knowledge and skills in the following areas will be preferred:

Business:
- Ability to effectively manage stakeholder interactions and relationships, particularly in the US.
- Experience in high-performing teams, preferably in data analytics, consulting, and/or private equity.
- Strong expertise in Analytics Consulting, with a demonstrated ability to translate complex data into actionable insights.
- Proficiency in utilizing business frameworks to analyze markets and evaluate company positioning and performance.
- Experience working with alternative data and market datasets to derive insights on competitive positioning and company performance.
- Understanding of financial statements, business cycles, diligence processes, financial modeling, valuation, etc.
- Experience in a collaborative environment and delivering under time-sensitive client deadlines.
- Providing valuable insights by comprehending clients' businesses, industries, and value drivers.
- Excellent communication and presentation skills.

Technical:
- Collaborative, innovative, and resourceful in applying tools and techniques to address client queries.
- Ability to synthesize insights and recommendations into concise and comprehensive client presentations.
- Proven track record in data extraction, transformation, analytics, and visualization.
- Proficiency in tools such as Alteryx, PySpark, Python, Advanced Excel, Power BI (including visualization and DAX), and MS Office.
- Familiarity with GenAI/Large Language Models (LLMs) is a plus.
- Experience in big data and machine learning concepts.
- Strong background in utilizing data and business intelligence software to derive actionable insights.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
Qualcomm India Private Limited is a leading technology innovator pushing the boundaries of what's possible to enable next-generation experiences and drive digital transformation for a smarter, connected future. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud edge software, applications, and specialized utility programs to launch cutting-edge, world-class products exceeding customer needs. Collaborate with various teams to design system-level software solutions and gather performance requirements and interfaces.

Minimum Qualifications:
- Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field.

Senior Machine Learning & Data Engineer: Join our team as a Senior Machine Learning & Data Engineer with expertise in Python development. Design scalable data pipelines, build and deploy ML/NLP models, and enable data-driven decision-making within the organization.

Key Responsibilities:
- Data Engineering & Infrastructure: Design and implement robust ETL pipelines and data integration workflows using SQL, NoSQL, and big data technologies.
- Machine Learning & NLP: Build, fine-tune, and deploy ML/NLP models using frameworks like TensorFlow, PyTorch, and Scikit-learn.
- Python Development: Develop scalable backend services using Python frameworks such as FastAPI, Flask, or Django.
- Collaboration & Communication: Work closely with cross-functional teams to integrate ML solutions into production systems.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Strong Python programming skills and experience with modern libraries and frameworks.
- Deep understanding of ML/NLP concepts and practical experience with LLMs and RAG architectures.
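A tiny example of the kind of text-feature step an ML/NLP pipeline performs is TF-IDF weighting, sketched below in plain Python. In practice a role like this would use Scikit-learn's `TfidfVectorizer` or a deep-learning tokenizer; the corpus and function name here are made up for illustration.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Return one {term: tf-idf score} dict per document.

    tf is term frequency within the document; idf is log(N / document
    frequency), so terms appearing in every document score zero.
    """
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(docs)
    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        scores.append({t: (tf[t] / len(toks)) * math.log(n / df[t])
                       for t in tf})
    return scores

docs = ["chip design tools", "chip test automation", "ml test pipelines"]
scores = tf_idf(docs)
# "design" (unique to doc 0) outranks "chip" (shared with doc 1).
```

Rarer terms get higher weight, which is why TF-IDF remains a useful baseline feature before reaching for embeddings or LLMs.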
Automation Engineer: As an Automation Engineer proficient in C#/Python development, you will play a crucial role in developing advanced solutions for Product Test automation. Collaborate with stakeholders to ensure successful implementation and operation of automation solutions.

Responsibilities:
- Design, develop, and maintain core APIs using C#.
- Identify, troubleshoot, and optimize API development and testing.
- Stay updated with industry trends in API development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in developing APIs using C# and Python.
- Strong understanding of software testing principles and methodologies.

Qualcomm is an equal opportunity employer committed to providing accessible processes for individuals with disabilities. For accommodations, contact disability-accommodations@qualcomm.com.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Data Scientist at mPokket, you will be an integral part of our rapidly growing fintech startup in the lending domain. We specialize in offering micro loans to college students and young professionals through our mobile app, with over 11 million loans disbursed. As a profitable organization backed by a US based private equity firm, we are now poised to scale our business 10x-20x in the upcoming years. If you are an ambitious individual with a passion for working with raw data to derive meaningful insights, we invite you to join us on this incredible journey. Your primary responsibility will be to collaborate with the data science team to plan and execute projects aimed at building advanced analytics models. You should possess a strong problem-solving ability, a flair for statistical analysis, and the capability to align data products with our business objectives. By leveraging data effectively, your role will contribute towards enhancing our products and driving informed business decisions. 
Key Responsibilities:
- Supervise the team of data scientists and data specialists to ensure seamless project execution
- Mentor and guide colleagues on innovative techniques and solutions
- Work closely with data and software engineers to implement scalable technologies across the organizational ecosystem
- Conceptualize, plan, and prioritize data projects in alignment with business goals
- Develop analytic systems and predictive models, and explore new methodologies to enhance performance
- Ensure that all data projects are in sync with the overall organizational objectives

Minimum Qualifications:
- Master's degree in Computer Science, Operations Research, Econometrics, Statistics, or a related technical field
- Over 6 years of experience in solving analytical problems using quantitative approaches
- Proficient in communicating quantitative analysis results effectively
- Familiarity with relational databases, SQL, and scripting languages like Python, PHP, or Perl
- Knowledge of statistics, including hypothesis testing and regressions
- Experience in manipulating data sets using statistical software like R, SAS, or similar tools

Technical Skills:
Must have:
- Programming: Python (preferred) / R
- ML Models: Regression (linear, logistic, multinomial, mixed effect), Classification (random forest, decision tree, SVM), Clustering (K-Means, hierarchical, DBSCAN), Time series (ARIMA, SARIMA, ARIMAX, Holt-Winters, multivariate TS, UCM), Neural Networks, Naive Bayes
- Excel and SQL
- Dimensionality Reduction: PCA, SVD, etc.
- Optimization Techniques: linear programming, gradient descent, genetic algorithms
- Cloud: understanding of Azure/AWS offerings, setting up ML pipelines on cloud

Good to have:
- Visualization: Tableau, Power BI, Looker, QlikView
- Data management: HDFS, Spark, Advanced Excel
- Agile tools: Azure DevOps, JIRA
- PySpark
- Big Data / Hive database
- IDE: PyCharm
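Of the model families listed above, K-Means clustering is compact enough to sketch from scratch. This is a one-dimensional toy version for illustration only; real work would use scikit-learn's `KMeans` on multi-dimensional feature vectors, and the data here is invented.

```python
import random

def kmeans_1d(points, k, iters=20, seed=0):
    """Toy 1-D K-Means: alternate assigning points to the nearest
    center and moving each center to its cluster's mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Keep a center unchanged if its cluster emptied out.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two well-separated groups converge to their group means.
print(kmeans_1d([1, 2, 3, 100, 101, 102], k=2))  # → [2.0, 101.0]
```

The same assign/update loop generalizes to higher dimensions by swapping `abs(p - c)` for Euclidean distance.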
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
As an Artificial Intelligence Specialist at Gen AI, you will be responsible for driving customer conversations, understanding customer requirements, creating Gen AI solution architectures, and developing customer proposals and RFP responses. You will guide solution engineers in creating Gen AI POCs and solutions for various industry verticals. Your role will involve staying updated on the latest technology developments, industry best practices, and incorporating them into Gen AI applications. Additionally, you will design and deploy Proof of Concepts (POCs) and Points of View (POVs) across different industry verticals to showcase the potential of Generative AI applications. To qualify for this role, you should have at least 8 years of experience in software development, with a minimum of 3 years of experience in Generative AI solution development. A bachelor's degree or higher in Computer Science, Software Engineering, or related fields is required. You should be adept at critical thinking, logical reasoning, and have a strong ability to learn new industry domains quickly. Being a team player who can deliver under pressure is essential. Furthermore, you should have experience with cloud technologies such as Azure, AWS, or GCP, as well as a good understanding of NVIDIA or similar technologies. A solid appreciation of AI/ML concepts and sound design principles is necessary for this role. In terms of required skills, you should be extremely dynamic and enthusiastic about technology. Development experience with languages like C++, Java, JavaScript, HTML, C#, Python, or node.js is preferred. You should be able to adapt quickly to new challenges and evolving technology stacks. Excellent written and verbal communication skills in English are essential, along with strong analytical and critical thinking abilities. A customer-focused attitude, initiative-taking, self-driven nature, and the ability to learn quickly are also important qualities for this role. 
Knowledge of Python, ML Algorithms, Statistics, source code maintenance, versioning tools, Object-Oriented Programming Concepts, debugging, and analytical skills is required. Preferred skills for this position include at least 5 years of experience in ML development and MLOps. Strong programming skills in Python, knowledge of ML, Data, and API libraries, and expertise in creating end-to-end data pipelines are advantageous. Experience with ML models, ModelOps/MLOps, AutoML, AI Ethics, Trust, Explainable AI, and popular ML frameworks like SparkML, TensorFlow, scikit-learn, XGBoost, H2O, etc., is beneficial. Familiarity with working in cloud environments (AWS, Azure, GCP) or containerized environments (Mesos, Kubernetes), interest in understanding functional and industry business challenges, and knowledge of IT industry and GenAI use cases in insurance processes are preferred. Expertise in Big Data and Data Modeling is also desirable for this role.
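The retrieval half of a retrieval-augmented generation (RAG) application, central to Gen AI solution work like this, reduces to scoring documents against a query and returning the best match. The sketch below uses bag-of-words cosine similarity in plain Python; a production system would use embedding models and a vector store, and the documents here are invented.

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts (a stand-in for an embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs):
    """Return the document most similar to the query."""
    qv = vectorize(query)
    return max(docs, key=lambda d: cosine(qv, vectorize(d)))

docs = ["spark handles big data",
        "claims processing in insurance",
        "python web framework"]
print(retrieve("insurance claims", docs))  # → "claims processing in insurance"
```

In a full RAG pipeline, the retrieved passage would then be injected into the LLM prompt as grounding context.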
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a Big Data Engineer at UST, you will design, develop, and maintain scalable, distributed data architectures capable of processing large volumes of data.

Key Responsibilities:
- Design, develop, and maintain scalable and distributed data architectures capable of processing large volumes of data.
- Implement and optimize data storage solutions using technologies such as Hadoop, Spark, and PySpark.
- Develop and implement efficient ETL processes using PySpark to extract, transform, and load large datasets.
- Optimize PySpark applications for better performance, scalability, and resource management.

Qualifications:
- Proven experience as a Big Data Engineer with a strong focus on PySpark.
- Deep understanding of Big Data processing frameworks and technologies.
- Strong proficiency in PySpark for developing and optimizing ETL processes and data transformations.
- Experience with distributed computing and parallel processing.
- Ability to collaborate in a fast-paced, innovative environment.
Skills required: PySpark, Big Data, Python.

Join UST, a global digital transformation solutions provider, and work alongside the world's best companies to make a real impact through transformation. With over 20 years of experience, UST partners with clients from design to operation, embedding innovation and agility into their organizations. Be part of a team of over 30,000 employees in 30 countries, building for boundless impact and touching billions of lives in the process.
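The partition-then-aggregate model behind the distributed computing and PySpark optimization work described above can be sketched in plain Python, with a thread pool standing in for Spark executors. This is only an analogy: Spark distributes partitions across cluster nodes with its own scheduler, and the function names here are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def partition(data, n):
    """Split a list into roughly n chunks (Spark partitions, in miniature)."""
    size = max(1, len(data) // n)
    return [data[i:i + size] for i in range(0, len(data), size)]

def process_partition(chunk):
    """Per-partition work: here, a sum of squares."""
    return sum(x * x for x in chunk)

def run(data, workers=4):
    """Map the work over partitions in parallel, then reduce the results."""
    parts = partition(data, workers)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return sum(ex.map(process_partition, parts))

print(run(list(range(10))))  # → 285
```

Tuning in real PySpark is largely about getting this split right: partition counts, skew, and caching determine how evenly the per-partition work lands on executors.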
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your overall objective in this role is to contribute to applications systems analysis and programming activities. Your responsibilities will include having strong knowledge of Big Data, Hadoop, or Cloudera, proficiency in programming languages such as Spark or Scala, and good working knowledge of the Kafka API. You will consult with users, clients, and other technology groups on issues, recommend programming solutions, and install and support customer exposure systems. Additionally, you will apply fundamental knowledge of programming languages for design specifications, analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging. You will also serve as an advisor or coach to new or lower-level analysts, identify problems, analyze information, make evaluative judgments to recommend and implement solutions, and resolve issues by identifying and selecting solutions through the application of acquired technical experience and guided by precedents. You should have the ability to operate with a limited level of direct supervision, exercise independence of judgment and autonomy, act as a subject matter expert to senior stakeholders and other team members, and appropriately assess risk when making business decisions. To qualify for this position, you should have 4-6 years of relevant experience in the Financial Service industry, intermediate-level experience in an Applications Development role, consistently demonstrate clear and concise written and verbal communication, possess demonstrated problem-solving and decision-making skills, and have the ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements.
Education requirements include a Bachelor's degree/University degree or equivalent experience. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
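The topic-based produce/consume pattern behind the Kafka work this role mentions can be mimicked with an in-memory sketch. Real Kafka is a distributed, durable log with partitions, offsets, and consumer groups, none of which this toy reproduces; the class and topic names are invented for illustration.

```python
from collections import defaultdict, deque

class MiniBroker:
    """In-memory stand-in for a message broker: producers append to a
    named topic, consumers read messages back in order."""

    def __init__(self):
        self.topics = defaultdict(deque)

    def produce(self, topic, message):
        self.topics[topic].append(message)

    def consume(self, topic):
        """Return the oldest unread message, or None if the topic is empty."""
        queue = self.topics[topic]
        return queue.popleft() if queue else None

broker = MiniBroker()
broker.produce("payments", {"id": 1})
broker.produce("payments", {"id": 2})
print(broker.consume("payments"))  # → {'id': 1}
```

The key property shown — per-topic FIFO ordering with decoupled producers and consumers — is the same contract application code relies on when using the real Kafka client API.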
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for analyzing genetics, epigenetics, metagenomics, and metabolomics data obtained from high-throughput multiplex assays using advanced computational techniques. Your role will involve integrating various datasets to address specific biological questions and designing pipelines for statistical genetics programs for both basic and translational research purposes. Additionally, you will be expected to write manuscripts and contribute to various research studies while managing project deliveries, responding to customer queries, and addressing custom requirements from clients. As a Senior Bioinformatician, you must possess a basic understanding of molecular biology and genomics. Hands-on experience in NGS data analysis, including RNA-seq, Exome-seq, ChIP-seq, and downstream analysis, is essential. Proficiency in programming using Perl or Python, as well as experience in designing and developing bioinformatics pipelines, is required. Preferably, you should also have experience in statistical programming using R, Bioconductor, or Matlab. Knowledge in WGS, GWAS, and population genomics is advantageous. You should be adept at handling big data and capable of guiding team members across multiple projects simultaneously. Collaboration with different groups of clinical research scientists for various project requirements is expected. The ability to work effectively as part of a team or independently with minimal support is essential for success in this role.
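A first building block of the Python bioinformatics pipelines this role describes is parsing sequence files and computing simple per-sequence statistics. The sketch below handles a minimal FASTA string and computes GC content; real NGS pipelines would use Biopython or similar on full-size files, and the sequences here are made up.

```python
def parse_fasta(text):
    """Parse FASTA-formatted text into {record_id: sequence}."""
    records = {}
    name = None
    for line in text.strip().splitlines():
        if line.startswith(">"):
            name = line[1:].split()[0]  # record id is the first header token
            records[name] = []
        elif name:
            records[name].append(line.strip())
    return {n: "".join(parts) for n, parts in records.items()}

def gc_content(seq):
    """Fraction of G and C bases in a nucleotide sequence."""
    s = seq.upper()
    return (s.count("G") + s.count("C")) / len(s) if s else 0.0

fasta = ">seq1 example record\nATGC\nGGCC\n>seq2\nATAT"
records = parse_fasta(fasta)
print({name: gc_content(seq) for name, seq in records.items()})
# → {'seq1': 0.75, 'seq2': 0.0}
```

The same read-parse-summarize shape scales up into the RNA-seq and WGS pipelines the posting mentions, with alignment and variant-calling tools in between.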
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have at least 5 years of work experience in building infrastructure solutions based on customer requirements. The location for this position is Chennai/Hyderabad. It is essential to have good working experience with Data Center products such as x86 servers, storage, HCI, and AI solutions. Your responsibilities will include building x86-based solutions using 3-tier or HCI-based architecture, as well as building storage solutions such as SAN, NAS, and SDS. A good understanding of server and storage virtualization options is necessary, along with knowledge of different types of workloads like Database, SAP HANA, and Big Data. You should also have a fair understanding of Cloud technology, including its suitability for workloads, building a cloud stack, and integration with public cloud offerings like Azure, AWS, and GCP. Demonstrating a consultative approach in engaging with customers to explore and understand their current landscape, suggesting modernization options, and improving efficiency will be a key part of your role. Good presentation and whiteboarding skills are required to explain solutions to customers and determine the optimal fit for their needs. Experience in working on RFPs, both in building RFPs and responding to them, would be beneficial. You will be responsible for building configurations using Lenovo tools for the infrastructure solutions proposed to customers. Supporting Business Partners by engaging with their end customers and making product portfolio and concept presentations at marketing events or to educate partners will also be part of your responsibilities. Collaboration is key, as you should be a team player working with other Solution Consultants and the Sales team towards the common goal of providing the best solution to customers and winning their confidence. This position requires a candidate who is a graduate and possesses strong teamwork and communication skills.
If you are excited about being part of Lenovo's transformative journey and contributing to a more inclusive, trustworthy, and smarter future, visit www.lenovo.com for more information and updates.
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
As a Senior Data Modeller, you will be responsible for leading the design and development of conceptual, logical, and physical data models for enterprise and application-level databases. Your expertise in data modeling, data warehousing, and data governance, particularly in cloud environments, Databricks, and Unity Catalog, will be crucial for the role. You should have a deep understanding of business processes related to master data management in a B2B environment and experience with data governance and data quality concepts. Your key responsibilities will include designing and developing data models, translating business requirements into structured data models, defining and maintaining data standards, collaborating with cross-functional teams to implement models, analyzing existing data systems for optimization, creating entity relationship diagrams and data flow diagrams, supporting data governance initiatives, and ensuring compliance with organizational data policies and security requirements. To be successful in this role, you should have at least 12 years of experience in data modeling, data warehousing, and data governance. Strong familiarity with Databricks, Unity Catalog, and cloud environments (preferably Azure) is essential. Additionally, you should possess a background in data normalization, denormalization, dimensional modeling, and schema design, along with hands-on experience with data modeling tools like ERwin. Experience in Agile or Scrum environments, proficiency in integration, databases, data warehouses, and data processing, as well as a track record of successfully selling data and analytics software to enterprise customers are key requirements. Your technical expertise should cover Big Data, streaming platforms, Databricks, Snowflake, Redshift, Spark, Kafka, SQL Server, PostgreSQL, and modern BI tools. 
Your ability to design and scale data pipelines and architectures in complex environments, along with excellent soft skills including leadership, client communication, and stakeholder management, will be valuable assets in this role.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Lead/Senior Data Scientist at HiLabs, you will be a key player in leveraging AI/ML techniques to identify and interpret complex healthcare problems. You will be responsible for the full-stack development of data pipelines involving Big Data, designing and developing robust application/data pipelines using Python, Scala, Spark, and SQL. Additionally, you will lead a team of Data Scientists and developers to strategize, design, and evaluate AI-based solutions to healthcare problems, aiming to increase efficiency and improve the quality of solutions offered. The ideal candidate for this role should hold Bachelor's or Master's degrees in computer science, Mathematics, or any other quantitative discipline from Premium/Tier 1 institutions, along with 5 to 7 years of experience in developing robust ETL data pipelines and implementing advanced AI/ML algorithms. Strong experience with technologies like Python, Scala, Spark, Apache Solr, MySQL, Airflow, AWS, and relational databases is essential. Moreover, a good understanding of large system architecture and design, core concepts of Machine Learning, and experience working in AWS/Azure cloud environment are highly valued. Your responsibilities will include managing the complete ETL pipeline development process, collaborating with the team on writing, building, and deploying data software, and ensuring high-quality code through code reviews, testing, and debugging. You should be well-versed in using Version Control tools, continuous integration and delivery tools, and have experience working in an Agile software delivery environment. At HiLabs, we are an equal opportunity employer dedicated to fostering a diverse and inclusive workforce. We offer competitive salary, accelerated incentive policies, comprehensive benefits package, H1B sponsorship, ESOPs, medical coverage, 401k, PTOs, and a collaborative working environment. 
If you are motivated, skilled, and passionate about advancing AI/ML excellence and technology innovation in healthcare, we encourage you to apply for this exciting opportunity at HiLabs. Thank you for considering a career with HiLabs, where we are committed to transforming the healthcare industry through innovation and collaboration.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Product Manager at Google, you will be responsible for guiding products from conception to launch by connecting the technical and business worlds. Your primary focus will be on analyzing, positioning, packaging, promoting, and tailoring solutions to meet the needs of millions of users worldwide. You will work cross-functionally with various teams including engineering, marketing, legal, UX, and others to develop and launch innovative products and features. Your role will involve understanding the cloud ecosystem, market trends, competition, and user requirements to drive product development. To excel in this role, you should have a Bachelor's degree or equivalent practical experience, along with at least 5 years of experience in product management or a related technical role. Additionally, you should have a track record of taking technical products from conception to launch and experience in domains such as customer service or business applications. Preferred qualifications include a Master's degree in a technology or business-related field, as well as experience in generative AI Co-pilot, big data, security and privacy, development and operations, or machine learning. You should also possess the ability to influence multiple stakeholders without direct authority. Google Cloud aims to provide top-quality support and drive customer retention through positive support experiences. By leveraging cutting-edge technology and tools, Google Cloud helps organizations digitally transform and solve critical business problems. In this dynamic and collaborative environment, you will play a key role in developing solutions that optimize and automate customer support interactions. Your contributions will directly impact the efficiency, scale, and overall positive customer experiences within the Cloud ecosystem. 
If you are passionate about driving innovation, working on impactful products, and shaping the future of technology, this Product Manager role at Google offers an exciting opportunity to make a difference. Join us in our mission to deliver enterprise-grade solutions and help organizations around the world achieve their digital transformation goals.
Posted 2 weeks ago
5.0 - 10.0 years
16 - 20 Lacs
Gurugram, Bengaluru
Work from Office
Role & responsibilities
Posted 2 weeks ago
0.0 - 5.0 years
2 - 7 Lacs
Chennai
Work from Office
Join our tech-driven team to build scalable software and collaborate on innovative AI projects. Software Programmer - Data Analytics (0-5 Years). Description: We are looking for programmers with knowledge of and experience in R, Python, or SAS. Candidates should have a solid understanding of machine learning and analytics algorithms. Knowledge of databases and big data environments is advantageous for this role.
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Working at Atlassian: Atlassians can choose where they work, whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company. Responsibilities: Atlassian is looking for a Data Engineer to join our Data Engineering team, responsible for building our data lake, maintaining big data pipelines/services, and facilitating the movement of billions of messages each day. We work directly with business stakeholders and platform and engineering teams to enable growth and retention strategies at Atlassian. We are looking for an open-minded, structured thinker who is passionate about building services and pipelines that scale. On a typical day you will help our stakeholder teams ingest data faster into our data lake, find ways to make our data pipelines more efficient, or even come up with ideas to help instigate self-serve data engineering within the company. You will be involved in strategizing measurement, collecting data, and generating insights. Benefits & Perks: Atlassian offers a wide range of perks and benefits designed to support you and your family and to help you engage with your local community. Our offerings include health and wellbeing resources, paid volunteer days, and so much more. To learn more, visit
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Systems Engineer III - Big Data Platform Testing. Tesco India, Bengaluru, Karnataka, India. Full-Time Permanent. Apply by 22-Jul-2025. About the role: Systems Engineer III - Big Data Testing. What is in it for you: At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of three pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable. Salary - Your fixed pay is the guaranteed pay as per your contract of employment. Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy. Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF. Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their family. Our medical insurance provides coverage for dependents, including parents or in-laws. Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents. Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request. Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle. You will be responsible for the following: Collaborate with product managers and developers to understand product requirements and contribute to solution design. Validate data sources, transformation logic, and data persistence in target systems. Develop and execute robust, data-centric test plans aligned with organizational goals. Design and implement scalable automation frameworks for data platforms, supporting a wide range of onboarded applications. Author detailed functional, integration, and regression test cases. Coordinate with stakeholders to review test plans, identify gaps, and ensure coverage of complex scenarios. Generate accurate and representative test data to simulate real-world platform operations. Work cross-functionally with engineering teams to ensure end-to-end test coverage and automation. Analyze server logs, databases, and other system components to provide detailed bug reports. Write and optimize SQL queries for test data extraction and validation. Design and enhance test automation frameworks to support evolving platform requirements. Track and report comprehensive quality metrics, including defect density, open and closed defect counts, test coverage, test execution pass/fail rates, mean time to detect (MTTD), mean time to resolve (MTTR), requirements coverage, and automation effectiveness. Develop mocks and stubs to facilitate isolated component testing. You will need: A minimum of 12 years of experience in Quality Engineering, with a strong focus on data platform testing and validation. In-depth understanding of software QA methodologies, testing strategies, tools, and best practices. Proven ability to design and implement comprehensive test plans and automation strategies tailored to data-centric platforms.
Strong hands-on experience in Python for scripting and developing test automation frameworks. Proficient in Microsoft Azure ecosystem, including services such as Azure Storage, Azure Databricks, Azure Data Factory, Azure SQL Database, and other SQL-based services. Strong understanding of data pipelines, ETL processes, and data warehousing concepts. Expertise in automation tools, scripting languages, and workflow orchestration platforms. Experience working in Linux/Unix environments for test execution and automation. Solid knowledge of SQL and NoSQL databases, as well as pub-sub messaging technologies (e.g., Kafka, Azure Event Hubs). Experience working in Agile/Scrum environments, with active participation in sprint planning, reviews, and retrospectives. Strong problem-solving abilities with a keen eye for troubleshooting and debugging complex data and integration issues. Demonstrated experience in large-scale integration programs, including third-party system/component integration. Excellent verbal and written communication skills, with the ability to articulate technical concepts to both technical and non-technical stakeholders. About us Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. 
Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues. Tesco Technology: Today, our Technology team consists of over 5,000 experts spread across the UK, Poland, Hungary, the Czech Republic, and India. In India, our Technology division includes teams dedicated to Engineering, Product, Programme, Service Desk and Operations, Systems Engineering, Security & Capability, Data Science, and other roles. At Tesco, our retail platform comprises a wide array of capabilities, value propositions, and products, essential for crafting exceptional retail experiences for our customers and colleagues across all channels and markets. This platform encompasses all aspects of our operations, from identifying and authenticating customers and managing products, pricing, and promotions, to enabling customers to discover products, facilitating payment, and ensuring delivery. By developing a comprehensive Retail Platform, we ensure that as customer touchpoints and devices evolve, we can consistently deliver seamless experiences. This adaptability allows us to respond flexibly without the need to overhaul our technology, thanks to the capabilities we have built.
Posted 2 weeks ago
7.0 - 11.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About Tekion: Positively disrupting an industry that has not seen any innovation in over 50 years, Tekion has challenged the paradigm with the first and fastest cloud-native automotive platform that includes the revolutionary Automotive Retail Cloud (ARC) for retailers, Automotive Enterprise Cloud (AEC) for manufacturers and other large automotive enterprises, and Automotive Partner Cloud (APC) for technology and industry partners. Tekion connects the entire spectrum of the automotive retail ecosystem through one seamless platform. The transformative platform uses cutting-edge technology, big data, machine learning, and AI to seamlessly bring together OEMs, retailers/dealers and consumers. With its highly configurable integration and greater customer engagement capabilities, Tekion is enabling the best automotive retail experiences ever. We're inventing new technology along the way to overcome barriers and solve big problems, all while having a blast doing it. With offices in North America, Asia, and Europe, Tekion employs close to 3,000 people worldwide. Key Responsibilities: Design and build solutions for complex business workflows. Take end-to-end ownership of components and be responsible for the subsystems you work on, from design, code, testing, and integration through deployment and enhancements.
Contribute high-quality code and take responsibility for your tasks. Solve performance bottlenecks. Provide mentorship to other engineers. Communicate and collaborate with management, product, QA, and UI/UX teams. Deliver with quality, on time, in a fast-paced start-up environment. Skills & Qualifications: Bachelor's/Master's in computer science or relevant fields. Strong sense of ownership. Excellent Java and object-oriented development skills. Experience in building and scaling microservices. Solid understanding of at least one NoSQL database. Strong problem-solving, technical troubleshooting, and diagnostic skills. Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team. Good communication skills. Perks and Benefits: Competitive compensation. Generous stock options. Medical insurance coverage. Work with some of the brightest minds from Silicon Valley's most dominant and successful companies.
Posted 2 weeks ago
10.0 - 14.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Position Summary... What you'll do... About Team: This is the team which builds reusable technologies that aid in acquiring customers and onboarding and empowering merchants, besides ensuring a seamless experience for both these stakeholders. We also optimize tariffs and assortment, adhering to the Walmart philosophy - Everyday Low Cost. In addition to ushering in affordability, we also create personalized experiences for customers the omnichannel way, across all channels - in-store, on the mobile app, and websites. Marketplace is the gateway to domestic and international third-party sellers; we enable them to manage their end-to-end onboarding, catalog management, order fulfilment, and return & refund management. Our team is responsible for the design, development, and operations of large-scale distributed systems, leveraging cutting-edge technologies in web/mobile, cloud, big data & AI/ML. We interact with multiple teams across the company to provide scalable, robust technical solutions. What you'll do: Drive architecture, design, development, operation and documentation of large-scale services. Build, test and deploy cutting-edge solutions at scale, impacting associates of Walmart worldwide. Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community. Engage with Engineers, Product Management and Business to drive technical discoveries and design, set your priorities, and deliver awesome products. Drive Proof-of-Concept and Proof-of-Technology evaluations. Drive the success of the implementation by applying technical skills to design and build enhanced processes and technical solutions in support of strategic initiatives. Work closely with the Architects and cross-functional teams and follow established practices for the delivery of solutions meeting QCD (Quality, Cost & Delivery), within the established architectural guidelines.
Publish and update technical architecture and user/process documentation. You will need to exhibit strong technical leadership and communication skills to collaborate with business, product, engineering & management across different geographic locations. Guide and mentor other team members to promote highly technical and self-sufficient teams. What you'll bring: Minimum qualifications: B.Tech. / B.E. / M.Tech. / M.S. in Computer Science. 10 - 14 years of experience in the design and development of highly scalable applications and platform development in product-based companies or R&D divisions. Strong computer science fundamentals: data structures, algorithms, design patterns. 7+ years of hands-on experience building applications using Java, Spring Boot, and Microservices. 5+ years of experience in systems design, algorithms, and distributed systems. Extensive experience with RDBMS/NoSQL databases, enterprise messaging applications (Kafka), and big data (Spark). Hands-on experience in designing and implementing cloud-native Microservices Architecture and related stacks using containerization technologies like Docker, Kubernetes, etc. Well versed in TDD/BDD methodologies and the enabling tools and technologies - JUnit, TestNG, Cucumber, CI/CD, etc. Practitioner of Agile methodologies and DevOps. CI/CD development environments/tools: Git, Maven, Jenkins. Experience with performance testing tools, e.g. JMeter, LoadRunner, etc. Experience with architectural patterns for High Availability, Performance, Scale-out Architecture, Disaster Recovery, and Security Architecture. Exposure to cloud infrastructures such as OpenStack, Azure, GCP, or AWS. Strong desire to drive change, and ability to adapt to change quickly. Proficient in new and emerging technologies. Strong hands-on development skills to prototype technical solutions.
Must be a proven performer and team player who enjoys challenging assignments with high energy in a fast-growing environment. Strong engineering mindset, able to drive the design and development of automated monitoring, alerting, and self-healing systems. About Walmart Global Tech: Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. Flexible, hybrid work. Benefits. Belonging. At Walmart, our vision is everyone included. By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate. Equal Opportunity Employer: Walmart, Inc., is an Equal Opportunities Employer By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions while being welcoming of all people. Minimum Qualifications... Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or related area and 4 years' experience in software engineering or related area. Option 2: 6 years' experience in software engineering or related area. Preferred Qualifications...
Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or related area and 2 years' experience in software engineering or related area. Primary Location... BLOCK-1, PRESTIGE TECH PACIFIC PARK, SY NO. 38/1, OUTER RING ROAD, KADUBEESANAHALLI, India
Posted 2 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: Python (Programming Language). Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application development with organizational goals, ensuring that the solutions provided are effective and efficient. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Facilitate regular team meetings to discuss progress and address any roadblocks. Professional & Technical Skills: - Good to have skills: PySpark, AWS, Airflow, Databricks, SQL. - Experience should be 6+ years in Python. - Candidate must be a strong hands-on senior developer. - As a lead, steer the team in completing their tasks and solving technical issues. - Candidate must possess good technical/non-technical communication skills to highlight areas of concern or risk. - Should have good troubleshooting skills to perform RCA of production support issues. Additional Information: - The candidate should have a minimum of 5 years of experience in Python (Programming Language). - This position is based at our Bengaluru office. - A 15-year full time education is required. - Candidate must be willing to work in Shift B, i.e. from 11 AM IST to 9 PM IST. Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Navi Mumbai
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must Have Skills: Proficiency in Snowflake Data Warehouse. - Good To Have Skills: Experience with data modeling and database design. - Strong understanding of ETL processes and data integration techniques. - Familiarity with cloud platforms and services related to data storage and processing. - Experience in performance tuning and optimization of data queries.
Additional Information: - The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. - This position is based in Mumbai. - A 15-year full time education is required. Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
9 - 13 Lacs
Pune
Work from Office
Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data strategy and architecture. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation and contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Engage in continuous learning to stay updated with industry trends and technologies. - Assist in the documentation of data platform processes and best practices. Professional & Technical Skills: - Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good To Have Skills: Experience with cloud data services and data warehousing solutions. - Strong understanding of data integration techniques and ETL processes. - Familiarity with data governance and data quality frameworks. - Experience in working with big data technologies and frameworks. Additional Information: - The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Pune office. - A 15-year full time education is required. Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Navi Mumbai
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: PySpark, Python (Programming Language), AWS Architecture, Apache Spark. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must Have Skills: Proficiency in PySpark, Python (Programming Language), AWS Architecture, Apache Spark. - Good To Have Skills: Experience with data processing frameworks and cloud services. - Strong understanding of distributed computing principles. - Experience with data pipeline development and optimization. - Familiarity with containerization technologies such as Docker. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based in Mumbai. - A 15-year full time education is required. Qualification: 15 years full time education
Posted 2 weeks ago