
947 OLAP Jobs - Page 5

JobPe aggregates results for easy access, but you apply directly on the original job portal.

2.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

We are looking for a Data Engineer with 2+ years of experience to set up, build, deploy and maintain scalable and reliable data assets and reporting/dashboarding solutions, ensuring a compelling user experience and insights that drive real business value.

Job Description
In your new role you will:
- Set up, build, deploy and maintain scalable and reliable data assets and reporting/dashboarding solutions, ensuring a compelling user experience and insights that drive real business value.
- Improve the speed and quality of data provisioning and data usage, e.g. by effectively combining data from different sources, across platforms and formats.
- Ensure scalability, robustness and sustainable usage of the solutions you develop.
- Team up with our domain, IT and methods experts to capture the full value of our data, e.g. by ensuring optimal use of existing tools and resources and by working in tandem on complex analytics cases.

Your Profile
You are best equipped for this task if you have:
- A degree in Information Technology, Business Informatics, Computer Science or a related field of study.
- 1-3 years of relevant work experience in Data Engineering, Data Visualization and/or Data Analytics.
- Solid expertise in database concepts (e.g. DWH, Hadoop/Big Data, OLAP, MS Access) and related query languages (e.g. SQL, MDX).
- Solid expertise in visualization of complex/big data using Tableau and Tableau Prep.
- The ability to translate complex business needs into concrete actions.
- The ability to work both independently and within a team.
- Fluency in English.

Contact: Shavin.shashidhar@infineon.com

#WeAreIn for driving decarbonization and digitalization. As a global leader in semiconductor solutions in power systems and IoT, Infineon enables game-changing solutions for green and efficient energy, clean and safe mobility, as well as smart and secure IoT. Together, we drive innovation and customer success, while caring for our people and empowering them to reach ambitious goals. Be a part of making life easier, safer and greener. Are you in?

We are on a journey to create the best Infineon for everyone. This means we embrace diversity and inclusion and welcome everyone for who they are. At Infineon, we offer a working environment characterized by trust, openness, respect and tolerance, and we are committed to giving all applicants and employees equal opportunities. We base our recruiting decisions on the applicant's experience and skills. Please let your recruiter know if they need to pay special attention to something in order to enable your participation in the interview process.
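
By way of illustration for the OLAP, SQL and MDX skills this posting lists, here is a minimal, self-contained sketch of a star-schema aggregation in Python; the sales_facts table and the MDX cube and dimension names are hypothetical, not part of the posting.

```python
# Minimal OLAP-style aggregation sketch; sqlite3 is in the standard library.
# The sales_facts schema is hypothetical; any star-schema fact table works.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales_facts (region TEXT, year INTEGER, revenue REAL);
    INSERT INTO sales_facts VALUES
        ('EMEA', 2023, 120.0), ('EMEA', 2024, 150.0),
        ('APAC', 2023, 90.0),  ('APAC', 2024, 110.0);
""")

# SQL aggregation over the fact table: revenue by region and year.
rows = conn.execute("""
    SELECT region, year, SUM(revenue) AS total_revenue
    FROM sales_facts
    GROUP BY region, year
    ORDER BY region, year
""").fetchall()
for region, year, total in rows:
    print(f"{region} {year}: {total}")

# The equivalent slice against an OLAP cube would be expressed in MDX,
# e.g. (cube and dimension names are illustrative):
#   SELECT {[Measures].[Revenue]} ON COLUMNS,
#          [Region].[Region].Members ON ROWS
#   FROM [SalesCube] WHERE ([Date].[Year].&[2024])
```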

Posted 1 week ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Team: Enterprise Business Services is invested in building a compact, robust organization that includes service operations and technology solutions for Finance, People and Associate Digital Experience. Our team is responsible for the design and development of solutions that know our consumers' needs better than ever by predicting what they want based on unconstrained demand, and that efficiently unlock strategic growth, economic profit and wallet share by orchestrating intelligent, connected planning and decisioning across all functions. We interact with multiple teams across the company to provide scalable, robust technical solutions. This role will play a crucial part in overseeing the planning, execution and delivery of complex projects within the team.

Walmart's Enterprise Business Services (EBS) is a powerhouse of several exceptional teams delivering world-class technology solutions and services that make a profound impact at every level of Walmart. As a key part of Walmart Global Tech, our teams set the bar for operational excellence and leverage emerging technology to support millions of customers, associates and stakeholders worldwide. Each time an associate turns on their laptop, a customer makes a purchase, a new supplier is onboarded, the company closes the books, physical and legal risk is avoided, or we pay our associates consistently and accurately, that is EBS. Joining EBS means embarking on a journey of limitless growth, relentless innovation, and the chance to set new industry standards that shape the future of Walmart.

What you'll do:
- Manage a high-performing team of 8-10 engineers who work across multiple technology stacks, including Java and Mainframe.
- Drive design, development, implementation and documentation.
- Establish best engineering and operational excellence practices based on product, engineering and scrum metrics.
- Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
- Engage with Product and Business stakeholders to drive the agenda, set priorities and deliver scalable, resilient products.
- Work closely with Architects and cross-functional teams, following established practices to deliver solutions that meet QCD (Quality, Cost & Delivery) within the established architectural guidelines.
- Work with senior leadership to chart out the future roadmap of the products.
- Participate in hiring, mentoring and building high-performing agile teams.
- Participate in organizational events like hackathons and demo days, and be a catalyst for their success.
- Interact closely with business owners and technical teams, both within India and across the globe, on requirements.

What you'll bring:
- Bachelor's/Master's degree in Computer Science, engineering or a related field, with a minimum of 12 years of experience in software development and at least 5 years of experience managing engineering teams.
- Prior experience managing high-performing agile technology teams.
- Hands-on experience building Java-based backend systems is a must; experience working on cloud-based solutions is desirable.
- Proficiency in JavaScript, NodeJS, ReactJS and NextJS.
- A good understanding of CS fundamentals, microservices, data structures, algorithms and problem solving.
- Exposure to CI/CD development environments and tools including, but not limited to, Git, Maven and Jenkins.
- Strength in writing modular and testable code and test cases (unit, functional and integration) using frameworks like JUnit, Mockito and MockMvc.
- Experience with microservices architecture and a good understanding of distributed concepts, common design principles, design patterns and cloud-native development concepts.
- Hands-on experience with Spring Boot, concurrency, garbage collection, RESTful services, data caching services and ORM tools.
- Experience working with relational databases and writing complex OLAP, OLTP and SQL queries.
- Experience working with NoSQL databases like Cosmos DB.
- Experience working with caching technologies like Redis, Memcached or related systems.
- Good knowledge of pub/sub systems like Kafka.
- Experience using monitoring and alerting tools like Prometheus, Splunk and related systems, and excellent debugging and troubleshooting skills.
- Exposure to containerization tools like Docker, Helm and Kubernetes.
- Knowledge of public cloud platforms like Azure or GCP will be an added advantage.

About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team and be more flexible in our personal lives.

Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits and much more.

Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer
Walmart, Inc., is an Equal Opportunity Employer - By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions, while being inclusive of all people.
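
As an illustration of the caching experience this posting asks for, here is a minimal read-through cache sketch using the redis-py client; the host, key scheme, TTL and the placeholder database lookup are assumptions for the example, not part of the posting.

```python
# Read-through cache sketch using redis-py (pip install redis).
# Host, key naming and TTL are illustrative assumptions.
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_product(product_id: int) -> dict:
    """Return product data, serving from Redis when possible."""
    key = f"product:{product_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)             # cache hit
    product = load_from_database(product_id)  # cache miss: hit the DB
    r.setex(key, 300, json.dumps(product))    # cache for 5 minutes
    return product

def load_from_database(product_id: int) -> dict:
    # Placeholder for the real OLTP lookup (hypothetical helper).
    return {"id": product_id, "name": "example"}
```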

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description
Acronotics Limited specializes in building cutting-edge robotic automation and artificial intelligence solutions. We focus on modern automation technologies such as AI and RPA, aiming to transform how businesses operate. Our mission is to help clients design, develop, implement and run transformative robotic automation and AI-based solutions. Our product, Radium AI, automates bot monitoring and support activities, delivering frictionless digital worker support driven by AI.

Role Description
This is a full-time, on-site role located in Bengaluru for a Business Analyst - Corporate Finance. We are looking for someone with a strong background in delivering IT projects for finance functions in large enterprises. You will act as a critical liaison between our client's finance teams and our AI/technology teams to ensure that solutions are aligned with financial reporting requirements, KPIs and business goals. This is a client-facing role requiring both domain expertise in corporate finance and strong business analysis skills.

Responsibilities
- Work closely with client finance teams to gather, understand and document financial processes, KPIs and reporting needs.
- Translate business requirements into clear, structured documentation for AI engineers, data scientists and solution architects.
- Facilitate workshops, interviews and process walkthroughs with finance stakeholders (e.g., FP&A, Controlling, Tax, Treasury).
- Define and prioritise business requirements and user stories for AI and data-driven solutions.
- Analyse existing financial data sources (ERP systems like SAP, BI tools like Power BI, OLAP cubes) and propose integration/data-flow approaches.
- Collaborate with AI/ML and software engineering teams to ensure accurate understanding of financial concepts and requirements.
- Support the validation, testing and rollout of AI-based tools by ensuring they meet user expectations and financial accuracy standards.

Qualifications
- 5+ years of experience as a Business Analyst, with at least 2 years on finance-related IT projects.
- Strong understanding of corporate finance functions and KPIs (e.g., P&L, working capital, cost allocation, budgeting, forecasting).
- Experience working in an IT services, consulting or digital transformation environment.
- Familiarity with ERP systems (e.g., SAP, Oracle), BI platforms (e.g., Power BI, Tableau) and financial data structures.
- Ability to create process maps, functional specifications and business requirements documents.
- Strong stakeholder management and communication skills, especially with finance professionals and technical teams.
- Experience with Agile/Scrum methodologies is preferred.
- Exposure to AI/ML, analytics or automation projects is a plus.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced individual with a background in B.E./B.Tech/MCA/M.Sc. (Mathematics, Statistics, Computer Applications) and additional qualifications such as Microsoft certifications in .NET 4.0, you will be an integral part of our team. A project management certification (e.g., PMP) would be a plus.

With 8 years of hands-on experience in product management, architecture, design and development of enterprise and SaaS applications, you will be responsible for driving the product lifecycle. Your role will involve designing, developing and managing activities from product definition and planning through to production and release. You will develop the product strategy and roadmap, ensure timely delivery, and maintain a balanced workload among team members.

In the domain of Commodity Markets & Risk Management, your expertise in .NET coding and team management will be highly valued. You will work closely with design and technology teams to ensure the timely, high-quality release of products and enhancements. Your proficiency in Agile methodologies, Continuous Integration/Continuous Delivery (CI/CD) processes and cloud infrastructure will be essential in this role.

As a key player in our team, you will be required to possess strong technical expertise, a deep understanding of programming concepts, and the ability to make technology decisions that enhance IT department performance. Your problem-solving skills, positive approach and ability to manage customer expectations will contribute to the success of our projects.

Overall, your role will involve supervising product development, guiding software development team members, making technology decisions, and actively contributing to the growth and success of our product business. Your passion for learning, dedication to staying current with software development and technology trends, and strong communication skills will make you a valuable asset to our team.

Posted 1 week ago

Apply

2.0 - 5.0 years

13 - 15 Lacs

Hyderabad

Work from Office

We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing and maintaining complex data pipelines, integration frameworks and metadata-driven architectures that enable seamless data access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling and governance frameworks to support self-service analytics, AI-driven insights and enterprise-wide data management.

Roles & Responsibilities:
- Own the development of complex ETL/ELT data pipelines that process large-scale datasets.
- Contribute to the design, development and implementation of data pipelines, ETL/ELT processes and data integration solutions.
- Ensure data integrity, accuracy and consistency through rigorous quality checks and monitoring.
- Explore and implement new tools and technologies to enhance the ETL platform and pipeline performance.
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks.
- Understand the biotech/pharma domain and build highly efficient data pipelines to migrate and deploy complex data across systems.
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners and Scrum Masters to deliver incremental value.
- Use JIRA, Confluence and Agile DevOps tools to manage sprints, backlogs and user stories.
- Support continuous improvement, test automation and DevOps practices in the data engineering lifecycle.
- Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions.

What we expect of you

Must-Have Skills:
- Experience in Data Engineering with a focus on Databricks, AWS, Python, SQL and Scaled Agile methodologies.
- Strong understanding of data processing and transformation in big data frameworks (Databricks, Apache Spark, Delta Lake and distributed computing concepts).
- Strong, demonstrable understanding of AWS services.
- Ability to quickly learn, adapt and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery and DevOps practices.

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry.
- Exposure to APIs and full-stack development.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.

Education and Professional Certifications
- Bachelor's degree and 2 to 5 years of Computer Science, IT or related field experience, OR Master's degree and 1 to 4 years of Computer Science, IT or related field experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly and be organized and detail-oriented.
- Strong presentation and public speaking skills.
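
To make the ETL/ELT pipeline work described above concrete, here is a minimal Databricks-style PySpark sketch; the paths, column names and quality rule are assumptions for the example, and the Delta write presumes the delta-spark package is available.

```python
# Minimal Databricks-style ETL sketch with PySpark and Delta Lake.
# Paths, column names and the quality rule are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV landed by an upstream process.
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Transform: type the columns, drop rows failing a basic quality check.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
)

# Load: append into a partitioned Delta table for downstream analytics.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("order_date")
      .save("/mnt/curated/orders/"))
```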

Posted 1 week ago

Apply

9.0 - 12.0 years

15 - 19 Lacs

Hyderabad

Work from Office

We are seeking a Data Solutions Architect with deep expertise in biotech/pharma to design, implement and optimize scalable, high-performance data solutions that support enterprise analytics, AI-driven insights and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with engineering teams, business stakeholders and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices and Scaled Agile methodologies.

Roles & Responsibilities:
- Design and implement scalable, modular and future-proof data architectures that support enterprise initiatives.
- Develop enterprise-wide data frameworks that enable governed, secure and accessible data across various business domains.
- Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency and usability across analytical platforms.
- Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems and external data platforms.
- Optimize query performance, indexing, caching and storage strategies to enhance scalability, cost efficiency and analytical capabilities.
- Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms.
- Drive data governance strategies, ensuring security, compliance, access controls and lineage tracking are embedded into enterprise data solutions.
- Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring and proactive issue resolution, to improve operational efficiency.
- Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities.
- Collaborate with business stakeholders, product teams and technology leaders to align data architecture strategies with organizational goals.
- Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability.

Must-Have Skills:
- Experience in data architecture, enterprise data management and cloud-based analytics solutions.
- Well versed in the biotech/pharma domain, with a track record of solving complex problems for it through data strategy.
- Expertise in Databricks, cloud-native data platforms and distributed computing frameworks.
- Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL and data virtualization.
- Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions.
- Deep understanding of data governance, security, metadata management and access control frameworks.
- Hands-on experience with CI/CD for data solutions, DataOps automation and infrastructure as code (IaC).
- Proven ability to collaborate with cross-functional teams, including business executives, data engineers and analytics teams, to drive successful data initiatives.
- Strong problem-solving, strategic thinking and technical leadership skills.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.

Good-to-Have Skills:
- Experience with Data Mesh architectures and federated data governance models.
- Certification in cloud data platforms or enterprise architecture frameworks.
- Knowledge of AI/ML pipeline integration within enterprise data architectures.
- Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications
- 9 to 12 years of experience in Computer Science, IT or a related field.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly and be organized and detail-oriented.
- Strong presentation and public speaking skills.
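
As a concrete example of the batch orchestration this role touches (Apache Airflow is listed among the skills), here is a minimal DAG sketch; the DAG id, schedule and task bodies are illustrative assumptions, and the `schedule` parameter assumes Airflow 2.4+.

```python
# Minimal Apache Airflow DAG sketch for a daily batch pipeline.
# DAG id, task names and the callables' contents are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull yesterday's records from the source system")

def transform_and_load():
    print("clean, conform and load into the warehouse")

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; earlier versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_and_load",
                               python_callable=transform_and_load)
    extract_task >> load_task  # run extract before transform/load
```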

Posted 1 week ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Hyderabad

Work from Office

We are seeking an experienced Senior Manager, Data Engineering to lead and scale a strong team of data engineers. This role blends technical depth with strategic oversight and people leadership. The ideal candidate will oversee the execution of data engineering initiatives, collaborate with business analysts and multi-functional teams, manage resource capacity, and ensure delivery aligned to business priorities. In addition to technical competence, the candidate will be adept at managing agile operations and driving continuous improvement.

Roles & Responsibilities:
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
- Design, develop and implement robust data architectures and platforms to support business objectives.
- Oversee the development and optimization of data pipelines and data integration solutions.
- Establish and maintain data governance policies and standards to ensure data quality, security and compliance.
- Architect and manage cloud-based data solutions, leveraging AWS or other preferred platforms.
- Lead and motivate a strong data engineering team to deliver exceptional results.
- Identify, analyze and resolve complex data-related challenges.
- Collaborate closely with business stakeholders to understand data requirements and translate them into technical solutions.
- Stay abreast of emerging data technologies and explore opportunities for innovation.
- Lead and manage a team of data engineers, ensuring appropriate workload distribution, goal alignment and performance management.
- Work closely with business analysts and product stakeholders to prioritize and align engineering output with business objectives.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 8 to 10 years of experience (computer science and engineering preferred; other engineering fields considered), OR Bachelor's degree and 10 to 14 years of such experience, OR Diploma and 14 to 18 years of such experience.
- Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions.
- Strong understanding of cloud architecture principles and cost optimization strategies.
- Proficiency in Python, PySpark and SQL.
- Hands-on experience with big data ETL performance tuning.
- Proven ability to lead and develop strong data engineering teams.
- Strong problem-solving, analytical and critical thinking skills to address complex data challenges.
- Strong communication skills for collaborating with business and technical teams alike.

Preferred Qualifications:
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.
- Experience with AWS, GCP or Azure cloud services.

Professional Certifications:
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
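
To illustrate the "big data ETL performance tuning" the qualifications mention, here is a short PySpark sketch of two standard moves: broadcasting a small dimension table and right-sizing shuffle partitions. The table paths and sizes are assumptions for the example.

```python
# Sketch of two common Spark ETL tuning moves:
# broadcasting a small dimension table and right-sizing shuffle partitions.
# Table paths and relative sizes are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("etl_tuning").getOrCreate()

# Fewer shuffle partitions for a modest dataset avoids tiny-task overhead.
spark.conf.set("spark.sql.shuffle.partitions", "64")

facts = spark.read.parquet("/data/fact_sales")    # large fact table
regions = spark.read.parquet("/data/dim_region")  # small lookup table

# Broadcasting the small side skips an expensive shuffle of the fact table.
joined = facts.join(broadcast(regions), "region_id")

# Partitioned output lets downstream queries prune by date.
joined.write.mode("overwrite").partitionBy("sale_date").parquet("/data/out")
```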

Posted 1 week ago

Apply

4.0 - 9.0 years

40 - 45 Lacs

Hyderabad

Work from Office

We are seeking a Senior Data Engineer with expertise in graph data technologies to join our data engineering team and contribute to the development of scalable, high-performance data pipelines and advanced data models that power next-generation applications and analytics. This role combines core data engineering skills with specialized knowledge of graph data structures, graph databases and relationship-centric data modeling, enabling the organization to leverage connected data for deep insights, pattern detection and advanced analytics use cases. The ideal candidate will have a strong background in data architecture, big data processing and graph technologies, and will work closely with data scientists, analysts, architects and business stakeholders to design and deliver graph-based data engineering solutions.

Roles & Responsibilities:
- Design, build and maintain robust data pipelines using Databricks (Spark, Delta Lake, PySpark) for complex graph data processing workflows.
- Own the implementation of graph-based data models, capturing complex relationships and hierarchies across domains.
- Build and optimize graph databases such as Stardog, Neo4j, MarkLogic or similar to support query performance, scalability and reliability.
- Implement graph query logic using SPARQL, Cypher, Gremlin or GSQL, depending on platform requirements.
- Collaborate with data architects to integrate graph data with existing data lakes, warehouses and lakehouse architectures.
- Work closely with data scientists and analysts to enable graph analytics, link analysis, recommendation systems and fraud detection use cases.
- Develop metadata-driven pipelines and lineage tracking for graph and relational data processing.
- Ensure data quality, governance and security standards are met across all graph data initiatives.
- Mentor junior engineers and contribute to data engineering best practices, especially around graph-centric patterns and technologies.
- Stay up to date with the latest developments in graph technology, graph ML and network analytics.

What we expect of you

Must-Have Skills:
- Hands-on experience with Databricks, including PySpark, Delta Lake and notebook-based development.
- Hands-on experience with graph database platforms such as Stardog, Neo4j or MarkLogic.
- Strong understanding of graph theory, graph modeling and traversal algorithms.
- Proficiency in workflow orchestration and performance tuning for big data processing.
- Strong understanding of AWS services.
- Ability to quickly learn, adapt and apply new technologies, with strong problem-solving and analytical skills.
- Excellent collaboration and communication skills, with experience in the Scaled Agile Framework (SAFe), Agile delivery practices and DevOps practices.

Good-to-Have Skills:
- Deep expertise in the biotech and pharma industries.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.

Education and Professional Certifications
- Master's degree and 3 to 4 years of Computer Science, IT or related field experience, OR Bachelor's degree and 5 to 8 years of Computer Science, IT or related field experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly and be organized and detail-oriented.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
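
For a flavor of the graph query logic this role describes, here is a minimal sketch using the official neo4j Python driver with a Cypher traversal; the connection details and the Person/KNOWS schema are assumptions for the example, not part of the posting.

```python
# Minimal graph query sketch using the official neo4j driver (pip install neo4j).
# The URI, credentials and the Person/KNOWS schema are illustrative assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

# Cypher traversal: friends-of-friends who are not already direct friends,
# the shape of query behind many recommendation use cases.
CYPHER = """
MATCH (me:Person {name: $name})-[:KNOWS]->(:Person)-[:KNOWS]->(fof:Person)
WHERE NOT (me)-[:KNOWS]->(fof) AND fof <> me
RETURN DISTINCT fof.name AS suggestion
LIMIT 10
"""

with driver.session() as session:
    for record in session.run(CYPHER, name="Alice"):
        print(record["suggestion"])

driver.close()
```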

Posted 1 week ago

Apply

2.0 - 7.0 years

40 - 45 Lacs

Hyderabad

Work from Office

In this vital role you will be responsible for designing, building, maintaining, analyzing and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure it is accessible, reliable and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, build and support data ingestion, transformation and delivery pipelines across structured and unstructured sources within the enterprise data engineering environment.
- Manage and monitor day-to-day operations of the data engineering environment, ensuring high availability, performance and data integrity.
- Collaborate with data architects, data governance, platform engineering and business teams to support data integration use cases across R&D, Clinical, Regulatory and Commercial functions.
- Integrate data from laboratory systems, clinical platforms, regulatory systems and third-party data sources into enterprise data repositories.
- Implement and maintain metadata capture, data lineage and data quality checks across pipelines to meet governance and compliance requirements.
- Support real-time and batch data flows using technologies such as Databricks, Kafka, Delta Lake or similar.
- Work within GxP-aligned environments, ensuring compliance with data privacy, audit and quality control standards.
- Partner with data stewards and business analysts to support self-service data access, reporting and analytics enablement.
- Maintain operational documentation, runbooks and process automation scripts for continuous improvement of data fabric operations.
- Participate in incident resolution and root cause analysis, ensuring timely and effective remediation of data pipeline issues.
- Create documentation, playbooks and best practices for metadata ingestion, data lineage and catalog usage.
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners and Scrum Masters to deliver incremental value.
- Use JIRA, Confluence and Agile DevOps tools to manage sprints, backlogs and user stories.
- Support continuous improvement, test automation and DevOps practices in the data engineering lifecycle.
- Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions.

Must-Have Skills:
- Experience building and maintaining data pipelines that ingest and update metadata into enterprise data catalog platforms in biotech, life sciences or pharma.
- Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning for big data processing.
- Experience in data engineering, data operations or related roles, with at least 2 years in life sciences, biotech or pharmaceutical environments.
- Experience with cloud platforms (e.g., AWS, Azure or GCP) for data pipeline and storage solutions.
- Understanding of data governance frameworks, metadata management and data lineage tracking.
- Strong problem-solving skills, attention to detail, and ability to manage multiple priorities in a dynamic environment.
- Effective communication and collaboration skills for working across technical and business stakeholders.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices and DevOps practices.

Preferred Qualifications:
- Data engineering experience in the biotechnology or pharma industry.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.

Basic Qualifications:
- Master's degree and 3 to 4 years of Computer Science, IT or related field experience, OR Bachelor's degree and 5 to 8 years, OR Diploma and 7 to 9 years of such experience.

Professional Certifications:
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent verbal and written communication skills.
- High degree of professionalism and interpersonal skills.
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated presentation skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
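
To make the "data quality checks across pipelines" responsibility concrete, here is a minimal PySpark sketch of pre-publish assertions; the paths, columns and thresholds are assumptions for the example.

```python
# Sketch of pipeline-level data quality checks: simple row-count and
# null-rate assertions before a batch is published downstream.
# Paths, columns and thresholds are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.parquet("/mnt/staging/lab_results")

total = df.count()
null_ids = df.filter(F.col("sample_id").isNull()).count()

# Fail fast so bad batches never reach the curated zone.
assert total > 0, "empty batch: upstream extract likely failed"
assert null_ids / total < 0.01, f"too many null sample_ids: {null_ids}/{total}"

df.write.mode("append").parquet("/mnt/curated/lab_results")
```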

Posted 1 week ago

Apply

8.0 - 10.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Let's do this. Let's change the world. We are seeking a seasoned Principal Data Engineer to lead the design, development and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms while mentoring and guiding a team of data engineers.

Roles & Responsibilities:
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
- Design, develop and implement robust data architectures and platforms to support business objectives.
- Oversee the development and optimization of data pipelines and data integration solutions.
- Establish and maintain data governance policies and standards to ensure data quality, security and compliance.
- Architect and manage cloud-based data solutions, using AWS or other preferred platforms.
- Lead and motivate an impactful data engineering team to deliver exceptional results.
- Identify, analyze and resolve complex data-related challenges.
- Collaborate closely with business stakeholders to understand data requirements and translate them into technical solutions.
- Stay abreast of emerging data technologies and explore opportunities for innovation.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 8 to 10 years of experience (computer science and engineering preferred; other engineering fields considered), OR Bachelor's degree and 10 to 14 years of such experience, OR Diploma and 14 to 18 years of such experience.
- Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions.
- Strong understanding of cloud architecture principles and cost optimization strategies.
- Proficiency in Python, PySpark and SQL.
- Hands-on experience with big data ETL performance tuning.
- Proven ability to lead and develop impactful data engineering teams.
- Strong problem-solving, analytical and critical thinking skills to address complex data challenges.

Preferred Qualifications:
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.
- Experience with AWS, GCP or Azure cloud services.

Professional Certifications:
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Posted 1 week ago

Apply

2.0 - 7.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Let's do this. Let's change the world. In this vital role you will drive the development and implementation of our data strategy. The ideal candidate possesses a strong blend of technical expertise and data-driven problem-solving skills. As a Senior Data Engineer, you will play a crucial role in designing, building and optimizing our data pipelines and platforms while mentoring junior engineers.

Roles & Responsibilities:
- Contribute to the design, development and implementation of data pipelines, ETL/ELT processes and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment, managing scope, timelines and risks.
- Ensure data quality and integrity through rigorous testing and monitoring.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Work closely with data analysts, data scientists and business stakeholders to understand data requirements.
- Identify and resolve complex data-related challenges.
- Adhere to data engineering best practices and standards.
- Develop in an Agile environment, comfortable with Agile terminology and ceremonies.
- Use code versioning with Git, Jenkins and code migration tools; work with Jira or Rally.
- Identify and implement opportunities for automation and CI/CD.
- Stay up to date with the latest data technologies and trends.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree and 2 years of Computer Science, IT or related field experience, OR Master's degree and 8 to 10 years, OR Bachelor's degree and 10 to 14 years, OR Diploma and 14 to 18 years of such experience.

Preferred Qualifications:

Must-Have Skills:
- Demonstrated hands-on experience with cloud platforms (AWS, Azure, GCP) and the ability to architect cost-effective and scalable data solutions.
- Proficiency in Python, PySpark and SQL, with hands-on experience in big data ETL performance tuning.
- Strong development knowledge of Databricks.
- Strong analytical and problem-solving skills to address complex data challenges.

Good-to-Have Skills:
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience working with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with prompt engineering and model fine-tuning.
- Experience with DevOps/MLOps CI/CD build and deployment pipelines.

Professional Certifications:
- AWS Certified Data Engineer (preferred).
- Databricks Certification (preferred).
- Any SAFe Agile certification (preferred).

Soft Skills:
- Initiative to explore alternate technologies and approaches to solving problems.
- Skilled in breaking down problems, documenting problem statements and estimating effort.
- Effective communication and interpersonal skills for collaborating with multi-functional teams.
- Excellent analytical and problem-solving skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Posted 1 week ago

Apply

8.0 - 13.0 years

15 - 19 Lacs

Hyderabad

Work from Office

We are seeking a Data Solutions Architect to design, implement and optimize scalable, high-performance data solutions that support enterprise analytics, AI-driven insights and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with engineering teams, business stakeholders and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices and Scaled Agile methodologies.

Roles & Responsibilities:
- Design and implement scalable, modular and future-proof data architectures that support enterprise data lakes, data warehouses and real-time analytics.
- Develop enterprise-wide data frameworks that enable governed, secure and accessible data across various business domains.
- Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency and usability across analytical platforms.
- Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems and external data platforms.
- Optimize query performance, indexing, caching and storage strategies to enhance scalability, cost efficiency and analytical capabilities.
- Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms.
- Drive data governance strategies, ensuring security, compliance, access controls and lineage tracking are embedded into enterprise data solutions.
- Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring and proactive issue resolution, to improve operational efficiency.
- Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities.
- Collaborate with business stakeholders, product teams and technology leaders to align data architecture strategies with organizational goals.
- Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability.

What we expect of you

Must-Have Skills:
- Experience in data architecture, enterprise data management and cloud-based analytics solutions.
- Expertise in Databricks, cloud-native data platforms and distributed computing frameworks.
- Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL and data virtualization.
- Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions.
- Deep understanding of data governance, security, metadata management and access control frameworks.
- Hands-on experience with CI/CD for data solutions, DataOps automation and infrastructure as code (IaC).
- Proven ability to collaborate with cross-functional teams, including business executives, data engineers and analytics teams, to drive successful data initiatives.
- Strong problem-solving, strategic thinking and technical leadership skills.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.

Good-to-Have Skills:
- Deep expertise in the biotech and pharma industries.
- Experience with Data Mesh architectures and federated data governance models.
- Certification in cloud data platforms or enterprise architecture frameworks.
- Knowledge of AI/ML pipeline integration within enterprise data architectures.
- Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications
- Doctorate degree with 6-8 years of experience in Computer Science, IT or a related field, OR Master's degree with 8-10 years, OR Bachelor's degree with 10-12 years of such experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly and be organized and detail-oriented.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.

Posted 1 week ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Hyderabad

Work from Office

We are seeking a Data Solutions Architect with deep R&D expertise in biotech/pharma to design, implement and optimize scalable, high-performance data solutions that support enterprise analytics, AI-driven insights and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with R&D and engineering teams, business stakeholders and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices and Scaled Agile methodologies.

Roles & Responsibilities:
- Design and implement scalable, modular and future-proof data architectures that support R&D initiatives across the enterprise.
- Develop enterprise-wide data frameworks that enable governed, secure and accessible data across various business domains.
- Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency and usability across analytical platforms.
- Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems and external data platforms.
- Optimize query performance, indexing, caching and storage strategies to enhance scalability, cost efficiency and analytical capabilities.
- Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms.
- Drive data governance strategies, ensuring security, compliance, access controls and lineage tracking are embedded into enterprise data solutions.
- Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring and proactive issue resolution, to improve operational efficiency.
- Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities.
- Collaborate with business stakeholders, product teams and technology leaders to align data architecture strategies with organizational goals.
- Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability.

What we expect of you

Must-Have Skills:
- Experience in data architecture, enterprise data management and cloud-based analytics solutions.
- Well versed in the R&D domain of the biotech/pharma industry, with a track record of solving complex problems for it through data strategy.
- Expertise in Databricks, cloud-native data platforms and distributed computing frameworks.
- Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL and data virtualization.
- Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions.
- Deep understanding of data governance, security, metadata management and access control frameworks.
- Hands-on experience with CI/CD for data solutions, DataOps automation and infrastructure as code (IaC).
- Proven ability to collaborate with cross-functional teams, including business executives, data engineers and analytics teams, to drive successful data initiatives.
- Strong problem-solving, strategic thinking and technical leadership skills.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing and DevOps.

Good-to-Have Skills:
- Experience with Data Mesh architectures and federated data governance models.
- Certification in cloud data platforms or enterprise architecture frameworks.
- Knowledge of AI/ML pipeline integration within enterprise data architectures.
- Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications
- Doctorate degree with 3-5 years of experience in Computer Science, IT or a related field, OR Master's degree with 6-8 years, OR Bachelor's degree with 8-10 years of such experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly and be organized and detail-oriented.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.

Posted 1 week ago

Apply

12.0 years

27 - 35 Lacs

Madurai, Tamil Nadu, India

On-site

Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems and guide data strategy aligned with enterprise goals.

Key Responsibilities
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP).
- Define data strategy, standards and best practices for cloud data engineering and analytics.
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow) and BigQuery.
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata or Oracle to BigQuery).
- Architect data lakes, warehouses and real-time data platforms.
- Ensure data governance, security, lineage and compliance (using tools like Data Catalog, IAM and DLP).
- Guide a team of data engineers and collaborate with business stakeholders, data scientists and product managers.
- Create documentation, high-level designs (HLD) and low-level designs (LLD), and oversee development standards.
- Provide technical leadership in architectural decisions and future-proofing the data ecosystem.

Required Skills & Qualifications
- 10+ years of experience in data architecture, data engineering or enterprise data platforms.
- Minimum 3-5 years of hands-on experience with GCP data services.
- Proficiency in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer and Cloud SQL/Spanner; Python, Java and SQL; and data modeling (OLTP, OLAP, star/snowflake schemas).
- Experience with real-time data processing, streaming architectures and batch ETL pipelines.
- Good understanding of IAM, networking, security models and cost optimization on GCP.
- Prior experience leading cloud data transformation projects.
- Excellent communication and stakeholder management skills.

Preferred Qualifications
- GCP Professional Data Engineer / Architect certification.
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics.
- Exposure to AI/ML use cases and MLOps on GCP.
- Experience working in agile environments and client-facing roles.

What We Offer
- Opportunity to work on large-scale data modernization projects with global clients.
- A fast-growing company with a strong tech and people culture.
- Competitive salary, benefits and flexibility.
- A collaborative environment that values innovation and leadership.

Skills: Google Cloud Platform (GCP), GCP data services, data architecture.
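
As an illustration of the Dataflow/Beam-to-BigQuery ingestion named in the responsibilities, here is a minimal Apache Beam batch pipeline sketch; the bucket, table and schema are assumptions for the example, and running it on GCP requires Dataflow pipeline options not shown here.

```python
# Minimal Apache Beam batch pipeline sketch: read CSV lines from Cloud
# Storage, parse them, and load into BigQuery. Bucket, table and schema
# are illustrative assumptions (pip install apache-beam[gcp]).
import apache_beam as beam

def parse_line(line: str) -> dict:
    user_id, score = line.split(",")
    return {"user_id": user_id, "score": int(score)}

with beam.Pipeline() as pipeline:  # add DataflowRunner options to run on GCP
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/scores/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "Load" >> beam.io.WriteToBigQuery(
            "my-project:analytics.scores",
            schema="user_id:STRING,score:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```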

Posted 1 week ago

Apply

5.0 - 12.0 years

25 - 30 Lacs

Kolkata

Work from Office

Jul 18, 2025
Location: Kolkata
Designation: Senior Consultant
Entity: Deloitte Touche Tohmatsu India LLP

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, you bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team
As a member of the Operations, Industry and Domain Solutions team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative and globally aware individuals. We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.

Your work profile
The business analyst shall be a graduate in any field with a postgraduate qualification in business administration or equivalent; experienced in crafting and executing queries upon request for data and presenting information through reports and visualization; with an in-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (extract, transform, load) framework (a minimal ETL sketch follows this posting); and experienced in study methods for system and functional requirement specification.

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone's welcome; entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.
Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
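For illustration, a minimal extract-transform-load sketch of the kind the work profile references, assuming a CSV source and a SQLite target; the file names, columns, and aggregation are illustrative assumptions, not from the posting.

```python
# A minimal ETL sketch; every name here is hypothetical.
import sqlite3

import pandas as pd

# Extract: pull raw records from a source system.
raw = pd.read_csv("sales_raw.csv")  # assumed columns: order_id, region, amount

# Transform: clean types and aggregate to the grain the report needs.
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")
by_region = (
    raw.dropna(subset=["amount"])
    .groupby("region", as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "total_amount"})
)

# Load: write the curated table where the reporting layer can query it.
with sqlite3.connect("analytics.db") as conn:
    by_region.to_sql("sales_by_region", conn, if_exists="replace", index=False)
```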

Posted 1 week ago

Apply

1.0 - 9.0 years

14 - 16 Lacs

Kolkata

Work from Office

Location: Kolkata
Designation: Consultant
Entity: Deloitte Touche Tohmatsu India LLP

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, you bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team
As a member of the Operations, Industry and Domain Solutions team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative and globally aware individuals. We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.

Your work profile
The business analyst shall be a graduate in any field with a postgraduate qualification in business administration or equivalent; experienced in crafting and executing queries upon request for data and presenting information through reports and visualization; with an in-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (extract, transform, load) framework; and experienced in study methods for system and functional requirement specification.

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone's welcome; entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.
Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Greater Chennai Area

On-site

Job Title: Architect - Data Modeling
Location: Bengaluru, Chennai, Hyderabad

Who We Are
Tiger Analytics is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. We offer full-stack AI and analytics services & solutions to empower businesses to achieve real outcomes and value at scale. We are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Our purpose is to provide certainty to shape a better tomorrow. Our team of 4000+ technologists and consultants are based in the US, Canada, the UK, India, Singapore and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of our team leaders rank in Top 10 and 40 Under 40 lists, exemplifying our dedication to innovation and excellence. We are a Great Place to Work-Certified™ (2022-24), recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG and others. We have been ranked among the ‘Best’ and ‘Fastest Growing’ analytics firms lists by Inc., Financial Times, Economic Times and Analytics India Magazine.

Curious about the role? What would your typical day look like?
As an Architect, you will work to solve some of the most complex and captivating data management problems that would enable clients as a data-driven organization, seamlessly switching between the roles of Individual Contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
Engage the clients & understand the business requirements to translate those into data models.
Analyze customer problems, propose solutions from a data structural perspective, and estimate and deliver proposed solutions.
Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
Use the data modeling tool to create appropriate data models.
Create and maintain the Source to Target Data Mapping document that includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Gather and publish Data Dictionaries.
Ideate, design, and guide the teams in building automations and accelerators.
Maintain data models, capture data models from existing databases, and record descriptive information.
Contribute to building data warehouses & data marts (on Cloud) while performing data profiling and quality analysis.
Use version control to maintain versions of data models.
Collaborate with Data Engineers to design and develop data extraction and integration code modules.
Partner with the data engineers & testing practitioners to strategize ingestion logic, consumption patterns & testing.
Ideate to design & develop the next-gen data platform by collaborating with cross-functional stakeholders.
Work with the client to define, establish and implement the right modeling approach as per the requirement.
Help define the standards and best practices.
Monitor project progress to keep the leadership teams informed on milestones, impediments, etc.
Coach team members and review code artifacts.
Contribute to proposals and RFPs.

What do we expect?
10+ years of experience in the data space.
Decent SQL knowledge; able to suggest modeling approaches for a given problem.
Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
Hands-on experience working with OLAP & OLTP database models (dimensional models).
Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modeling (see the star-schema sketch after this posting), as well as experience with any ETL tool, Data Governance, and Data Quality.
An eye for analyzing data and comfort with following agile methodology.
A good understanding of any of the cloud services (Azure, AWS & GCP) is preferred.
Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
Experience in contributing to proposals and RFPs.
Good experience in stakeholder management.
Decent communication skills and experience in leading a team.

You are important to us, let's stay connected!
Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.
Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
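For illustration, a minimal star-schema sketch showing a flat extract split into a dimension table and a fact table; the columns and data are illustrative assumptions, not from the posting.

```python
# A minimal star-schema sketch in pandas; all names are hypothetical.
import pandas as pd

flat = pd.DataFrame(
    {
        "order_id": [1, 2, 3],
        "order_date": ["2025-01-01", "2025-01-01", "2025-01-02"],
        "product_name": ["Widget", "Gadget", "Widget"],
        "amount": [10.0, 25.0, 12.5],
    }
)

# Dimension table: one row per distinct product, with a surrogate key.
dim_product = flat[["product_name"]].drop_duplicates().reset_index(drop=True)
dim_product["product_key"] = dim_product.index + 1

# Fact table: measures plus foreign keys into the dimensions.
fact_orders = flat.merge(dim_product, on="product_name")[
    ["order_id", "order_date", "product_key", "amount"]
]

print(dim_product)
print(fact_orders)
```

The same split is what a Source to Target Mapping document records: which source columns land in which dimension or fact table, and through which keys.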

Posted 1 week ago

Apply

6.0 years

0 Lacs

Greater Chennai Area

On-site

Job Title: Senior Data Engineer - Data Modeling
Location: Bengaluru, Chennai, Hyderabad

Who We Are
Tiger Analytics is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. We offer full-stack AI and analytics services & solutions to empower businesses to achieve real outcomes and value at scale. We are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Our purpose is to provide certainty to shape a better tomorrow. Our team of 4000+ technologists and consultants are based in the US, Canada, the UK, India, Singapore and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of our team leaders rank in Top 10 and 40 Under 40 lists, exemplifying our dedication to innovation and excellence. We are a Great Place to Work-Certified™ (2022-24), recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG and others. We have been ranked among the ‘Best’ and ‘Fastest Growing’ analytics firms lists by Inc., Financial Times, Economic Times and Analytics India Magazine.

Curious about the role? What would your typical day look like?
As a Senior Data Engineer, you will work to solve some of the organizational data management problems that would enable clients as a data-driven organization, seamlessly switching between the roles of Individual Contributor, team member, and Data Modeling lead as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
Engage the clients & understand the business requirements to translate those into data models.
Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
Contribute to data modeling accelerators.
Create and maintain the Source to Target Data Mapping document that includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Gather and publish Data Dictionaries.
Maintain data models, capture data models from existing databases, and record descriptive information.
Use the data modeling tool to create appropriate data models.
Contribute to building data warehouses & data marts (on Cloud) while performing data profiling and quality analysis.
Use version control to maintain versions of data models.
Collaborate with Data Engineers to design and develop data extraction and integration code modules.
Partner with the data engineers to strategize ingestion logic and consumption patterns.

Expertise and Qualifications
What do we expect?
6+ years of experience in the data space.
Decent SQL skills.
Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
Hands-on experience working with OLAP & OLTP database models (dimensional models).
Good understanding of Star schema, Snowflake schema, and Data Vault modeling, as well as experience with any ETL tool, Data Governance, and Data Quality.
An eye for analyzing data and comfort with following agile methodology.
A good understanding of any of the cloud services (Azure, AWS & GCP) is preferred.

You are important to us, let's stay connected!
Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.
Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 1 week ago

Apply

7.0 years

5 - 7 Lacs

Thiruvananthapuram

On-site

7 - 9 Years | 1 Opening | Kochi, Trivandrum

Role description
Role Proficiency: Act creatively to develop applications by selecting appropriate technical options, optimizing application development, maintenance and performance by employing design patterns and reusing proven solutions. Account for others' developmental activities; assist the Project Manager in day-to-day project execution.

Outcomes:
Interpret the application feature and component designs to develop the same in accordance with specifications. Code, debug, test, document, and communicate product component and feature development stages. Validate results with user representatives, integrating and commissioning the overall solution. Select and create appropriate technical options for development, such as reusing, improving or reconfiguring existing components, while creating own solutions for new contexts. Optimise efficiency, cost and quality. Influence and improve customer satisfaction. Influence and improve employee engagement within the project teams. Set FAST goals for self/team; provide feedback on FAST goals of team members.

Measures of Outcomes:
Adherence to engineering process and standards (coding standards). Adherence to project schedule/timelines. Number of technical issues uncovered during the execution of the project. Number of defects in the code. Number of defects post delivery. Number of non-compliance issues. Percent of voluntary attrition. On-time completion of mandatory compliance trainings.

Outputs Expected:
Code: Code as per the design. Define coding standards, templates and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines and standards for design/process/development. Create/review deliverable documents, design documentation, requirements, test cases and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios and execution. Review the test plan created by the testing team. Provide clarifications to the testing team.
Domain relevance: Advise software developers on design and development of features and components with a deeper understanding of the business problem being addressed for the client. Learn more about the customer domain and identify opportunities to provide value addition to customers. Complete relevant domain certifications.
Manage Project: Support the Project Manager with inputs for the projects. Manage delivery of modules. Manage complex user stories.
Manage Defects: Perform defect RCA and mitigation. Identify defect trends and take proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage knowledge: Consume and contribute to project-related documents, SharePoint libraries and client universities. Review the reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to creation of design (HLD, LLD, SAD)/architecture for applications, features, business components and data models.
Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Work closely with customer architects for finalizing design.
Manage Team: Set FAST goals and provide feedback. Understand aspirations of the team members and provide guidance, opportunities, etc. Ensure team members are upskilled. Ensure the team is engaged in the project. Proactively identify attrition risks and work with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.

Skill Examples:
Explain and communicate the design/development to the customer. Perform and evaluate test results against product specifications. Break down complex problems into logical components. Develop user interfaces and business software components. Use data models. Estimate time and effort resources required for developing/debugging features/components. Perform and evaluate tests in the customer or target environments. Make quick decisions on technical/project-related challenges. Manage a team, mentor and handle people-related issues in the team. Maintain high motivation levels and positive dynamics within the team. Interface with other teams, designers and other parallel practices. Set goals for self and team; provide feedback for team members. Create and articulate impactful technical presentations. Follow a high level of business etiquette in emails and other business communication. Drive conference calls with customers and answer customer questions. Proactively ask for and offer help. Work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks. Build confidence with customers by meeting deliverables on time with a quality product.

Knowledge Examples:
Appropriate software programs/modules. Functional & technical designing. Programming languages (proficient in multiple skill clusters). DBMS. Operating Systems and software platforms. Software Development Life Cycle. Agile methods (Scrum or Kanban). Integrated development environments (IDE). Rapid application development (RAD). Modelling technology and languages. Interface definition languages (IDL). Broad knowledge of the customer domain and deep knowledge of the sub-domain where the problem is solved.

Additional Comments:
Over 8+ years of experience in developing BI applications utilizing SQL Server/SF/GCP/PostgreSQL, the BI stack, Power BI, and Tableau.
Practical understanding of data modelling (dimensional & relational) concepts like Star-Schema modelling, Snowflake Schema modelling, and Fact and Dimension tables.
Ability to translate business requirements into workable functional and non-functional requirements.
Capable of taking ownership and communicating with C-suite executives & stakeholders.
Extensive database programming experience in writing T-SQL, User Defined Functions, Triggers, Views, Temporary Tables, Constraints, and Indexes using various DDL and DML commands.
Experienced in creating SSAS-based OLAP cubes and writing complex DAX.
Ability to work with external tools like Tabular Editor and DAX Studio.
Understand complex and customized Stored Procedures and Queries for implementing business logic and processes in the backend, for data extraction.
Hands-on experience in Incremental Refresh, RLS, Parameterization, Dataflows and Gateways.
Experience in design and development of Business Intelligence solutions using SSRS and Power BI.
Experience in optimization of Power BI reports implementing Mixed and DirectQuery modes.

Skills: Power BI, Power Tools, Data Analysis

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation.
With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.

Posted 1 week ago

Apply

15.0 years

5 - 10 Lacs

Thiruvananthapuram

On-site

15 - 25 Years | 1 Opening | Kochi, Trivandrum

Role description
Data Architect | Power BI
Preferred: Trivandrum / Cochin. Other Locations: Chennai, Bangalore.

Additional Comments
Over 10+ years of experience in developing BI applications using SQL Server, Salesforce (SF), GCP, PostgreSQL, the BI stack, Power BI, and Tableau.
Strong practical understanding of data modeling concepts, both Dimensional (Star Schema, Snowflake Schema) and Relational, including Fact and Dimension tables.
Skilled in translating business requirements into detailed functional and non-functional specifications.
Capable of taking ownership of deliverables and effectively communicating with C-suite executives and stakeholders.
Extensive database programming experience in T-SQL, including User Defined Functions, Triggers, Views, Temporary Tables, Constraints, and Indexes, using various DDL and DML operations.
Proficient in building SSAS-based OLAP cubes and writing complex DAX expressions.
Hands-on expertise with external tools such as Tabular Editor and DAX Studio.
Strong ability to customize and optimize stored procedures and backend queries for complex data extraction and business logic.
Practical experience with Incremental Refresh, Row-Level Security (RLS), Parameterization, Dataflows, and Gateways.
Proven experience in designing and developing BI solutions using SSRS and Power BI.
Skilled in optimizing Power BI reports using Mixed Mode and DirectQuery.

Skills: Power BI, Power Tools, Data Analysis, Microsoft Fabric

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.

Posted 1 week ago

Apply

10.0 years

1 - 5 Lacs

Hyderābād

On-site

Category: Application Development and Support
Location: Hyderabad, Telangana
Job family: Architecture
Shift: Evening
Employee type: Regular Full-Time

Bachelor's in Computer Science, Information Systems, or a related field.
Minimum of 10+ years of experience in data architecture, with a minimum of 1-3 years of experience in the healthcare domain.
Strong hands-on experience with cloud databases such as Snowflake, Aurora, Google BigQuery, etc.
Experience in designing OLAP and OLTP systems for efficient data analysis and processing.
Strong hands-on experience with enterprise BI/reporting tools (Looker, AWS QuickSight, Power BI, Tableau and Cognos).
A strong understanding of HIPAA regulations and healthcare data privacy laws is a must-have for this role, as the healthcare domain requires strict adherence to data privacy and security regulations.
Experience in data privacy and tokenization tools like Immuta, Privacera, Privitar, OpenText and Protegrity.
Experience with multiple full life-cycle data warehouse/transformation implementations in the public cloud (AWS, Azure, and GCP), with deep technical knowledge in one.
Proven experience working as an Enterprise Data Architect or in a similar role, preferably in large-scale organizations.
Proficient in data modelling (Star Schema (de-normalized data model) and Transactional Model (normalized data model)) using tools like Erwin.
Experience with ETL/ELT architecture and integration (Matillion, AWS Glue, Google PLEX, Azure Data Factory, etc.).
Deep understanding of data architectures that utilize Data Fabric, Data Mesh, and Data Products implementations.
Business & financial acumen to advise on product planning, conduct research & analysis, and identify the business value of new and emerging technologies.
Strong SQL and database skills working with large structured and unstructured data.
Experienced in implementation of data virtualization and semantic-model-driven architecture.
System development lifecycle (SDLC), Agile development, DevSecOps, and standard software development tools such as Git and Jira.
Excellent written and oral communication skills to convey key choices, recommendations, and technology concepts to technical and non-technical audiences.
Familiarity with AI/MLOps concepts and Generative AI technology.

Posted 1 week ago

Apply

0 years

1 - 6 Lacs

Noida

On-site

As a Python Developer, you'll contribute to the core analytics engine that powers portfolio scoring and optimization. If you're strong in Python and love working with numbers, dataframes, and algorithms, this is the role for you.

Key Responsibilities
- Build and maintain internal Python libraries for scoring, data ingestion, normalization, and transformation (see the scoring sketch after this posting).
- Collaborate on the core computation engine, dealing with Return, Safety, and Income scoring algorithms.
- Process broker portfolio data and validate datasets using clean, modular Python code.
- Contribute to writing tests, CI scripts, and engine documentation.
- Participate in internal design discussions and implement enhancements to existing libraries and engines.
- Work with financial datasets like FactSet and Bloomberg to build quant models for the US market.

Must-Have Skills
- Proficiency in Python 3.x, with attention to clean, testable code.
- Hands-on experience with NumPy and pandas for numerical/data processing.
- Understanding of basic ETL concepts and working with structured datasets.
- Familiarity with Git, unit testing, logging, and code organization principles.
- Strong problem-solving ability and willingness to learn new technologies fast.

Good to Have (Bonus Points)
- Experience with SQL and writing efficient queries for analytics use-cases.
- Exposure to ClickHouse or other columnar databases optimized for fast OLAP workloads.
- Familiarity with data validation tools like pydantic or type systems like mypy.
- Knowledge of Python packaging tools (setuptools, pyproject.toml).
- Experience with Apache Arrow, Polars, FastAPI, or SQLAlchemy.
- Exposure to async processing (Celery, asyncio), Docker, or Kubernetes.

What You'll Gain
- Work on the core decision-making engine that directly impacts investor outcomes.
- Be part of a small, high-quality engineering team focused on clean, impactful systems.
- Mentorship and learning in data engineering, financial analytics, and production-quality backend systems.
- A growth path into specialized tracks: backend, data engineering, or system architecture.

About the Stack
While your main focus will be Python, you'll also interact with services that consume or publish to APIs, data stores like ClickHouse and PostgreSQL, and real-time processing queues.

Job Type: Full-time
Pay: ₹10,000.00 - ₹50,000.00 per month
Location Type: In-person
Work Location: In person
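For illustration, a minimal sketch of the scoring and normalization work described above, assuming a pandas DataFrame of holdings; the column names, weights, and min-max scaling choice are illustrative assumptions, not the firm's actual methodology.

```python
# A minimal portfolio-scoring sketch; all names and weights are hypothetical.
import pandas as pd


def min_max_normalize(s: pd.Series) -> pd.Series:
    """Scale a metric to [0, 1] so heterogeneous metrics can be blended."""
    rng = s.max() - s.min()
    return (s - s.min()) / rng if rng else pd.Series(0.5, index=s.index)


def composite_score(df: pd.DataFrame, weights: dict) -> pd.Series:
    """Weighted blend of normalized Return / Safety / Income style metrics."""
    return sum(w * min_max_normalize(df[col]) for col, w in weights.items())


holdings = pd.DataFrame(
    {
        "ticker": ["AAA", "BBB", "CCC"],
        "annual_return": [0.12, 0.07, 0.15],
        "volatility": [0.30, 0.12, 0.45],
        "dividend_yield": [0.01, 0.04, 0.00],
    }
)
# Lower volatility should score higher, so invert it before blending.
holdings["safety"] = 1 - min_max_normalize(holdings["volatility"])
holdings["score"] = composite_score(
    holdings, {"annual_return": 0.5, "safety": 0.3, "dividend_yield": 0.2}
)
print(holdings[["ticker", "score"]])
```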

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description
Acronotics Limited specializes in cutting-edge robotic automation and artificial intelligence solutions. By applying human intelligence to build advanced AI-fueled systems, Acronotics transforms businesses with technologies like AI and Robotic Process Automation (RPA). As a consulting and services firm, we are dedicated to creating automated solutions that will redefine how products are made, sold, and consumed. Our mission is to help clients implement and run game-changing robotic automation and artificial intelligence-based solutions. Explore our product, Radium AI, which automates bot monitoring and support activities, on our website: Radium AI.

Role Description
This is a full-time, on-site role for a Data Engineer (Power BI) based in Bengaluru. You will design and manage data pipelines that connect Power BI, OLAP cubes, documents (PDFs, presentations) and external data sources to Azure AI. Your role ensures structured and unstructured financial data is indexed and accessible for semantic search and LLM use.

Key Responsibilities:
Extract data from Power BI datasets, semantic models, and OLAP cubes.
Connect and transform data via Azure Synapse, Data Factory, and a Lakehouse architecture.
Preprocess PDFs, PPTs, and Excel files using Azure Form Recognizer or Python-based tools.
Design data ingestion pipelines for external web sources (e.g., commodity prices); a minimal ingestion sketch follows this posting.
Coordinate with AI engineers to feed cleaned and contextual data into vector indexes.

Requirements:
Strong experience with the Power BI REST/XMLA APIs.
Expertise in OLAP systems (SSAS, SAP BW), data modelling, and ETL design.
Hands-on experience with Azure Data Factory, Synapse, or Data Lake.
Familiarity with JSON, DAX, and M queries.
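For illustration, a minimal sketch of the external-source ingestion step named above, assuming a JSON API for commodity prices; the URL, payload shape, fields, and output path are all hypothetical.

```python
# A minimal web-source ingestion sketch; every name here is hypothetical.
import pandas as pd
import requests

API_URL = "https://example.com/api/commodity-prices"  # hypothetical endpoint

resp = requests.get(API_URL, params={"symbol": "XAU", "days": 30}, timeout=30)
resp.raise_for_status()

# Flatten the JSON payload into a tabular frame and normalize types.
prices = pd.json_normalize(resp.json()["prices"])  # assumed payload shape
prices["date"] = pd.to_datetime(prices["date"])
prices["close"] = pd.to_numeric(prices["close"], errors="coerce")

# Land the curated extract where downstream pipelines can pick it up.
prices.dropna(subset=["close"]).to_parquet("landing/commodity_prices.parquet")
```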

Posted 1 week ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must-have skills: Microsoft Azure Analytics Services
Good-to-have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly and efficiently throughout the organization, while also addressing any challenges that arise in the data management process. Your role will be pivotal in shaping the data landscape of the organization, enabling informed decision-making and strategic planning.

Roles & Responsibilities:
A. Function as the Lead Data Architect for a small, simple project/proposal, or as a team lead for a medium/large-sized project or proposal
B. Discuss specific Big Data architecture and related issues with the client architect/team (in area of expertise)
C. Analyze and assess the impact of the requirements on the data and its lifecycle
D. Lead Big Data architecture and design of medium-to-large cloud-based, Big Data and analytical solutions using the Lambda architecture
E. Breadth of experience in various client scenarios and situations
F. Experienced in Big Data architecture-based sales and delivery
G. Thought leadership and innovation
H. Lead creation of new data assets & offerings
I. Experience in handling OLTP and OLAP data workloads

Professional & Technical Skills:
A. Strong experience in Azure is preferred, with hands-on experience in two or more of these skills: Azure Synapse Analytics, Azure HDInsight, Azure Databricks with PySpark / Scala / SparkSQL, Azure Analysis Services
B. Experience in one or more real-time/streaming technologies, including Azure Stream Analytics, Azure Data Explorer, Azure Time Series Insights, etc.
C. Experience in handling medium to large Big Data implementations
D. Candidate must have around 5 years of extensive Big Data experience
E. Candidate must have 15 years of IT experience and around 5 years of extensive Big Data experience (design + build)

Additional Information:
A. Should be able to drive the technology design meetings, propose technology design and architecture
B. Should have excellent client communication skills
C. Should have good analytical and problem-solving skills

Posted 1 week ago

Apply

15.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must-have skills: Microsoft Azure Analytics Services
Good-to-have skills: NA
Minimum 15 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly and efficiently throughout the organization, while also addressing any challenges that arise in the data management process. Your role will be pivotal in shaping the data landscape of the organization, enabling informed decision-making and strategic planning.

Roles & Responsibilities:
A. Function as the Lead Data Architect for a small, simple project/proposal, or as a team lead for a medium/large-sized project or proposal
B. Discuss specific Big Data architecture and related issues with the client architect/team (in area of expertise)
C. Analyze and assess the impact of the requirements on the data and its lifecycle
D. Lead Big Data architecture and design of medium-to-large cloud-based, Big Data and analytical solutions using the Lambda architecture
E. Breadth of experience in various client scenarios and situations
F. Experienced in Big Data architecture-based sales and delivery
G. Thought leadership and innovation
H. Lead creation of new data assets & offerings
I. Experience in handling OLTP and OLAP data workloads

Professional & Technical Skills:
A. Strong experience in Azure is preferred, with hands-on experience in two or more of these skills: Azure Synapse Analytics, Azure HDInsight, Azure Databricks with PySpark / Scala / SparkSQL, Azure Analysis Services
B. Experience in one or more real-time/streaming technologies, including Azure Stream Analytics, Azure Data Explorer, Azure Time Series Insights, etc.
C. Experience in handling medium to large Big Data implementations
D. Candidate must have around 5 years of extensive Big Data experience
E. Candidate must have 15 years of IT experience and around 5 years of extensive Big Data experience (design + build)

Additional Information:
A. Should be able to drive the technology design meetings, propose technology design and architecture
B. Should have excellent client communication skills
C. Should have good analytical and problem-solving skills

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

