5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Hybrid
About the Team
The Data Platform team is responsible for the foundational data services, systems, and data products at Okta that benefit our users. Today, the Data Platform team solves challenges and enables:
- Streaming analytics
- Interactive end-user reporting
- Data and ML platform for Okta to scale
- Telemetry of our products and data
Our elite team is fast, creative, and flexible. We encourage ownership. We expect great things from our engineers and reward them with stimulating new projects, new technologies, and the chance to have significant equity in a company. Okta is about to change the cloud computing landscape forever.

About the Position
This is an opportunity for experienced Software Engineers to join our fast-growing Data Platform organization that is passionate about scaling high-volume, low-latency, distributed data-platform services and data products. In this role, you will get to work with engineers throughout the organization to build foundational infrastructure that allows Okta to scale for years to come. As a member of the Data Platform team, you will be responsible for designing, building, and deploying the systems that power our data analytics and ML. Our analytics infrastructure stack sits on top of many modern technologies, including Kinesis, Flink, Elasticsearch, and Snowflake. We are looking for experienced Software Engineers who can help design and own the building, deploying, and optimizing of the streaming infrastructure (a sketch of this kind of pipeline follows this description). This project has a directive from engineering leadership to make Okta a leader in the use of data and machine learning to improve end-user security and to expand that core competency across the rest of engineering. You will have a sizable impact on the direction, design, and implementation of the solutions to these problems.

Job Duties and Responsibilities:
- Design, implement, and own data-intensive, high-performance, scalable platform components
- Work with engineering teams, architects, and cross-functional partners on the development of projects, design, and implementation
- Conduct and participate in design reviews, code reviews, analysis, and performance tuning
- Coach and mentor engineers to help scale up the engineering organization
- Debug production issues across services and multiple levels of the stack

Required Knowledge, Skills, and Abilities:
- 5+ years of experience in an object-oriented language, preferably Java
- Hands-on experience with cloud-based distributed computing technologies, including:
  - Messaging systems such as Kinesis or Kafka
  - Data processing systems like Flink, Spark, or Beam
  - Storage and compute systems such as Snowflake or Hadoop
  - Coordinators and schedulers like the ones in Kubernetes, Hadoop, or Mesos
- Experience in developing and tuning highly scalable distributed systems
- Excellent grasp of software engineering principles
- Solid understanding of multithreading, garbage collection, and memory management
- Experience with reliability engineering, specifically in areas such as data quality, data observability, and incident management

Nice to have:
- Maintained security, encryption, identity management, or authentication infrastructure
- Leveraged major public cloud providers to build mission-critical, high-volume services
- Hands-on experience developing data integration applications for large-scale (petabyte-scale) environments, with experience in both batch and online systems
- Contributed to the development of distributed systems, or used one or more at high volume or criticality, such as Kafka or Hadoop
- Experience developing Kubernetes-based services on the AWS stack
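For context on the kind of streaming work this posting describes, below is a minimal sketch of a PySpark Structured Streaming job that reads events from a Kafka topic and produces a windowed aggregate. It is illustrative only: the broker address, topic name, event schema, and console sink are all assumptions, and an actual Okta pipeline (which favors Java, Flink, and Kinesis) would differ.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, from_json, window
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("auth-events-stream").getOrCreate()

# Hypothetical event schema; real topics and fields would differ.
schema = StructType([
    StructField("org_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "auth-events")                # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Windowed aggregation: events per org per minute, tolerating 5 minutes of lateness.
per_org = (
    events
    .withWatermark("event_time", "5 minutes")
    .groupBy(window(col("event_time"), "1 minute"), col("org_id"))
    .agg(count("*").alias("event_count"))
)

query = (
    per_org.writeStream
    .outputMode("append")
    .format("console")  # stand-in sink; production would write to a warehouse or another topic
    .option("truncate", "false")
    .start()
)
query.awaitTermination()
```

The same read-aggregate-write shape applies whether the engine is Spark or Flink; the watermark bounds how long late events are accepted before a window is finalized.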
Posted 2 weeks ago
2.0 - 7.0 years
4 - 8 Lacs
Bengaluru
Work from Office
We are looking for experienced Software Engineers who can help design and own the building, deploying, and optimizing of the streaming infrastructure. This project has a directive from engineering leadership to make Okta a leader in the use of data and machine learning to improve end-user security and to expand that core competency across the rest of engineering. You will have a sizable impact on the direction, design, and implementation of the solutions to these problems.

Job Duties and Responsibilities:
- Design, implement, and own data-intensive, high-performance, scalable platform components
- Work with engineering teams, architects, and cross-functional partners on the development of projects, design, and implementation
- Conduct and participate in design reviews, code reviews, analysis, and performance tuning
- Coach and mentor engineers to help scale up the engineering organization
- Debug production issues across services and multiple levels of the stack

Required Knowledge, Skills, and Abilities:
- 2+ years of software development experience
- Proficient in at least one backend language and comfortable in more than one, preferably Java or TypeScript, Ruby, GoLang, or Python
- Experience working with at least one of the database technologies MySQL, Redis, or PostgreSQL
- Demonstrable knowledge of computer science fundamentals with strong API design skills
- Comfortable working on a geographically distributed, extended team
- Brings the right attitude to the team: ownership, accountability, attention to detail, and customer focus
- Track record of delivering work incrementally to get feedback and iterating over solutions
- Comfortable in React or similar front-end UI stacks; if not comfortable yet, willing to learn

Nice to have:
- Experience with cloud-based distributed computing technologies such as:
  - Messaging systems such as Kinesis or Kafka
  - Data processing systems like Flink, Spark, or Beam
  - Storage and compute systems such as Snowflake or Hadoop
  - Coordinators and schedulers like the ones in Kubernetes, Hadoop, or Mesos
- Maintained security, encryption, identity management, or authentication infrastructure
- Leveraged major public cloud providers to build mission-critical, high-volume services
- Hands-on experience developing data integration applications for large-scale (petabyte-scale) environments, with experience in both batch and online systems
- Contributed to the development of distributed systems, or used one or more at high volume or criticality, such as Kafka or Hadoop
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
What you will do
Role Description: In this vital role, you will develop data and analytics capabilities for the Global Quality organization. You will lead technical delivery as part of a team of data engineers and software engineers. The team will rely on your leadership to own and refine the vision, prioritize features, align partners, and lead solution delivery for Amgen. You will drive the software engineering side of the product release and deliver for outcomes.

Roles & Responsibilities:
- Lead delivery of the overall product and product features from concept to end of life; manage the product team of engineers, product owners, and data scientists to ensure that business, quality, and functional goals are met with each product release
- Drive excellence and quality for product releases, collaborating with partner teams
- Bring hands-on experience with rapid prototyping and quickly translate concepts into working code
- Provide technical guidance and mentorship to junior developers
- Incorporate and prioritize feature requests into the product roadmap; translate the roadmap into execution
- Design and implement usability, quality, and delivery of a product or feature
- Plan releases and upgrades with no impact to the business
- Apply hands-on expertise in driving quality and best-in-class Agile engineering practices
- Encourage and motivate the product team to deliver innovative and exciting solutions with an appropriate sense of urgency
- Manage progress of work and address production issues during sprints
- Communicate with partners to make sure goals are clear and the vision is aligned with business objectives
- Provide direct management and staff development for team members

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree in computer science or a STEM major with a minimum of 5 years of Information Systems experience, OR Bachelor's degree in computer science or a STEM major with a minimum of 7 years of Information Systems experience

Must-Have Skills:
- Experience in people management and leading matrixed teams, and a passion for mentorship, culture, and fostering the development of talent
- Thorough understanding of modern web application development and delivery, Gen AI application development, data integration, and enterprise data fabric concepts, methodologies, and technologies (e.g., AWS technologies, Databricks)
- Prior hands-on experience with full-stack development, including back-end web and APIs using infrastructure cloud services (AWS preferred) and cloud-native tools and design patterns (containers, serverless, Docker, etc.)
- Demonstrated experience navigating a matrix organization and leading change
- Ability to define success metrics for developer productivity and, on a monthly/quarterly basis, analyze how the product team is performing against established KPIs

Preferred Qualifications:
- Strong influencing skills; able to influence stakeholders and balance priorities
- Prior experience in vendor management
- Prior experience in the biotechnology or pharma industry

Professional Certifications:
- AWS Certified Solutions Architect (preferred)
- Certified DevOps Engineer (preferred)
- Certified Agile Leader or similar (preferred)

Soft Skills:
- Strong desire for continuous learning to pick up new tools and technologies
- High attention to detail with critical-thinking ability
- Active contributor to technology communities and forums
- Proactively engages with cross-functional teams to resolve issues and design solutions using critical thinking, analysis skills, and best practices
- Influences and energizes others toward the common vision and goal; maintains excitement for a process and drives toward new ways of meeting the goal even when odds and setbacks render one path impassable
- Established habit of proactive thinking and behavior, and the desire and ability to self-start, learn, and apply new technologies
- Excellent organizational and time-management skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
Posted 2 weeks ago
5.0 - 10.0 years
8 - 13 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About Us: KPI Partners is a leading provider of data analytics and business intelligence solutions. We are committed to helping organizations excel through effective data management. Our innovative team focuses on delivering impactful business insights, and we are looking for talented individuals to join us on this journey.

Job Summary: We are seeking an experienced ETL Developer with expertise in Oracle Data Integrator (ODI) to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes to extract data from multiple sources and transform it into a format suitable for analysis. You will work closely with business analysts, data architects, and other stakeholders to ensure the successful implementation of data integration solutions.

Key Responsibilities:
- Design and implement ETL processes using Oracle Data Integrator (ODI) to support data warehousing and business intelligence initiatives.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Develop, test, and optimize ETL workflows, mappings, and packages to ensure efficient data loading and processing.
- Perform data quality checks and validations to ensure the accuracy and reliability of transformed data.
- Monitor and troubleshoot ETL processes to resolve issues and ensure timely delivery of data.
- Document ETL processes, technical specifications, and any relevant workflows.
- Stay up-to-date with industry best practices and technology trends related to ETL and data integration.

Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ETL Developer with a focus on Oracle Data Integrator (ODI).
- Strong understanding of ETL concepts, data warehousing, and data modeling.
- Proficiency in SQL and experience with database systems such as Oracle, SQL Server, or others.
- Familiarity with data integration tools and techniques, including data profiling, cleansing, and transformation.
- Experience in performance tuning and optimization of ETL processes.
- Excellent analytical and problem-solving skills.
- Strong communication and teamwork abilities, with a commitment to delivering high-quality results.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and career advancement.
- A collaborative and innovative work environment.
- The chance to work on exciting projects with leading organizations across various industries.

KPI Partners is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 2 weeks ago
6.0 - 10.0 years
5 - 9 Lacs
Greater Noida
Work from Office
Responsibilities:
- Design, develop, and maintain high-performance SQL and PL/SQL procedures, packages, and functions in Snowflake or other cloud database technologies.
- Apply advanced performance tuning techniques to optimize database objects, queries, indexing strategies, and resource usage.
- Develop code based on reading and understanding business and functional requirements, following the Agile process.
- Produce high-quality code to meet all project deadlines, ensuring the functionality matches the requirements.
- Analyze and resolve issues found during the testing or pre-production phases of the software delivery lifecycle, coordinating changes with project team leaders and cross-work-team members.
- Provide technical support to project team members and respond to inquiries regarding errors or questions about programs.
- Interact with architects, technical leads, team members, and project managers as required to address technical and schedule issues.
- Suggest and implement process improvements for estimating, development, and testing processes.
- Support the development of automated and repeatable processes for ETL/ELT, data integration, and data transformation using industry best practices.
- Support cloud migration and modernization initiatives, including re-platforming or refactoring legacy database objects for cloud-native platforms.

Requirements:
- BS degree in Computer Science, Information Technology, Electrical/Electronic Engineering, or another related field, or equivalent
- A minimum of 7 years of prior work experience in an application and database development organization, with deep expertise in Oracle PL/SQL or SQL Server T-SQL; must demonstrate experience delivering systems and projects from inception through implementation
- Proven experience writing and optimizing complex stored procedures, functions, and packages in relational databases such as Oracle, MySQL, and SQL Server
- Strong knowledge of performance tuning, including query optimization, indexing, statistics, execution plans, and partitioning
- Understanding of data integration pipelines, ETL tools, and batch processing techniques
- Solid software development and programming skills, with an understanding of design patterns and software development best practices
- Experience with Snowflake, Python scripting, and data transformation frameworks like dbt is a plus
- Work experience developing web applications with Java, JavaScript, HTML, and JSPs; experience with MVC frameworks such as Spring and Angular
Posted 2 weeks ago
2.0 - 6.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Responsibilities:
- Design, develop, and maintain high-performance SQL and PL/SQL procedures, packages, and functions in Snowflake or other cloud database technologies.
- Apply advanced performance tuning techniques to optimize database objects, queries, indexing strategies, and resource usage.
- Develop code based on reading and understanding business and functional requirements, following the Agile process.
- Produce high-quality code to meet all project deadlines, ensuring the functionality matches the requirements.
- Analyze and resolve issues found during the testing or pre-production phases of the software delivery lifecycle, coordinating changes with project team leaders and cross-work-team members.
- Provide technical support to project team members and respond to inquiries regarding errors or questions about programs.
- Interact with architects, technical leads, team members, and project managers as required to address technical and schedule issues.
- Suggest and implement process improvements for estimating, development, and testing processes.
- Support the development of automated and repeatable processes for ETL/ELT, data integration, and data transformation using industry best practices.
- Support cloud migration and modernization initiatives, including re-platforming or refactoring legacy database objects for cloud-native platforms.

Requirements:
- BS degree in Computer Science, Information Technology, Electrical/Electronic Engineering, or another related field, or equivalent
- A minimum of 7 years of prior work experience in an application and database development organization, with deep expertise in Oracle PL/SQL or SQL Server T-SQL; must demonstrate experience delivering systems and projects from inception through implementation
- Proven experience writing and optimizing complex stored procedures, functions, and packages in relational databases such as Oracle, MySQL, and SQL Server
- Strong knowledge of performance tuning, including query optimization, indexing, statistics, execution plans, and partitioning
- Understanding of data integration pipelines, ETL tools, and batch processing techniques
- Solid software development and programming skills, with an understanding of design patterns and software development best practices
- Experience with Snowflake, Python scripting, and data transformation frameworks like dbt is a plus
- Work experience developing web applications with Java, JavaScript, HTML, and JSPs; experience with MVC frameworks such as Spring and Angular
Posted 2 weeks ago
6.0 - 9.0 years
13 - 17 Lacs
Hyderabad
Work from Office
At Amgen, our shared mission—to serve patients—drives all that we do. It is key to our becoming one of the world’s leading biotechnology companies, reaching over 10 million patients worldwide. Become the professional you are meant to be in this important role.

The Financial Insights & Technology (FIT) team was created to: (1) maintain and build upon work/improvements enabled by various initiatives, (2) implement standardized reporting capabilities, (3) implement and maintain policies, processes, and systems needed to drive an efficient and effective reporting environment, and (4) explore new ways to evolve reporting, in the spirit of continuous improvement, to improve financial insights across the global finance organization. The Data Analytics Manager will be part of the FIT Data Analytics & Processes team within Corporate Finance. This role will be based in India-Hyderabad.

What will you do
In this vital role as a Data Analytics Manager, you will play a meaningful role in fully understanding financial data and the associated systems architecture in order to design data integrations for reporting and analysis to support the global Amgen organization. Key responsibilities include but are not limited to:
- Developing a robust understanding of Amgen's financial data and systems in order to support data requests and integrations for different initiatives
- Working in close partnership with clients or team members to design, develop, and augment the financial datasets
- Providing client-facing project management support and completing hands-on Databricks/Prophecy development, as well as Power BI or Tableau work, as time and priorities allow
- Designing and developing the underlying ETL data processes used to build various financial datasets used for reporting and/or dashboards
- Identifying data enhancements or process improvements to optimize the financial datasets and processes
- Understanding the regularly scheduled financial dataset refreshes in Databricks and the Tableau or Power BI dashboard refresh processes on an ongoing basis
- Involvement in the Financial Data Product Team as a finance data subject matter expert

Key elements for success in this role include understanding Amgen's financial systems and data, the ability to define business requirements, and understanding how to design datasets compatible with Power BI, Tableau, or other analytic tool reporting requirements.

What we expect of you
We are all different, yet we all use our outstanding contributions to serve patients. The Data Analytics Manager professional we seek is a go-getter with these qualifications.

Basic Qualifications:
- Doctorate degree; OR Master's degree and 2 years of Finance experience; OR Bachelor's degree and 4 years of Finance experience; OR Associate's degree and 10 years of Finance experience; OR High school diploma / GED and 12 years of Finance experience

Preferred Qualifications:
- Experience performing data analysis across one or more areas of the business to derive business logic for data integration
- Experience working with business partners to identify complex functionality and translate it into requirements
- Experience with financial statements; Amgen Finance experience preferred
- Experience with data analysis, data modeling, and data visualization solutions such as Power BI, Tableau, Databricks, and Alteryx
- Familiarity with Hyperion Planning, SAP, scripting languages like SQL or Python, Databricks/Prophecy, and AWS services like S3
- Able to work in matrixed teams, across geographic and functional reporting lines
- Excellent analytical and problem-solving skills
- Excellent facilitation, influencing, and negotiation skills
- Proficient in MS Office Suite
Posted 2 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE
Role Description: We are seeking a highly experienced and hands-on Test Automation Engineering Manager with strong leadership skills and deep expertise in data integration, data quality, and automated data validation across real-time and batch pipelines. In this strategic role, you will lead the design, development, and implementation of scalable test automation frameworks that validate data ingestion, transformation, and delivery from diverse sources into AWS-based analytics platforms, leveraging technologies like Databricks, PySpark, and cloud-native services. As a lead, you will drive the overall testing strategy, lead a team of test engineers, and collaborate cross-functionally with data engineering, platform, and product teams. Your focus will be on delivering high-confidence, production-grade data pipelines with built-in validation layers that support enterprise analytics, ML models, and reporting platforms. The role is highly technical and hands-on, with a strong focus on automation, metadata validation, and ensuring data governance practices are seamlessly integrated into development pipelines.

Roles & Responsibilities:
- Define and drive the test automation strategy for data pipelines, ensuring alignment with enterprise data platform goals.
- Lead and mentor a team of data QA/test engineers, providing technical direction, career development, and performance feedback.
- Own delivery of automated data validation frameworks across real-time and batch data pipelines using Databricks and AWS services.
- Collaborate with data engineering, platform, and product teams to embed data quality checks and testability into pipeline design.
- Design and implement scalable validation frameworks for data ingestion, transformation, and consumption layers.
- Automate validations for multiple data formats, including JSON, CSV, Parquet, and other structured/semi-structured file types, during ingestion and transformation.
- Automate data testing workflows for pipelines built on Databricks/Spark, integrated with AWS services like S3, Glue, Athena, and Redshift.
- Establish reusable test components for schema validation, null checks, deduplication, threshold rules, and transformation logic (see the sketch following this description).
- Integrate validation processes with CI/CD pipelines, enabling automated and event-driven testing across the development lifecycle.
- Drive the selection and adoption of tools/frameworks that improve automation, scalability, and test efficiency.
- Oversee testing of data visualizations in Tableau, Power BI, or custom dashboards, ensuring backend accuracy via UI and data-layer validations.
- Ensure accuracy of API-driven data services, managing functional and regression testing via Postman, Python, or other automation tools.
- Track test coverage, quality metrics, and defect trends, providing regular reporting to leadership and ensuring continuous improvement.
- Establish alerting and reporting mechanisms for test failures, data anomalies, and governance violations.
- Contribute to system architecture and design discussions, bringing a strong quality and testability lens early into the development lifecycle.
- Lead test automation initiatives by implementing best practices and scalable frameworks, embedding test suites into CI/CD pipelines to enable automated, continuous validation of data workflows, catalog changes, and visualization updates.
- Mentor and guide QA engineers, fostering a collaborative, growth-oriented culture focused on continuous learning and technical excellence.
- Collaborate cross-functionally with product managers, developers, and DevOps to align quality efforts with business goals and release timelines.
- Conduct code reviews, test plan reviews, and pair-testing sessions to ensure team-level consistency and high-quality standards.

Must-Have Skills:
- Hands-on experience with Databricks and Apache Spark for building and validating scalable data pipelines
- Strong expertise in AWS services including S3, Glue, Athena, Redshift, and Lake Formation
- Proficiency in Python, PySpark, and SQL for developing test automation and validation logic
- Experience validating data from various file formats such as JSON, CSV, Parquet, and Avro
- In-depth understanding of data integration workflows, including batch and real-time (streaming) pipelines
- Strong ability to define and automate data quality checks: schema validation, null checks, duplicates, thresholds, and transformation validation
- Experience designing modular, reusable automation frameworks for large-scale data validation
- Skilled in integrating tests with CI/CD tools like GitHub Actions, Jenkins, or Azure DevOps
- Familiarity with orchestration tools such as Apache Airflow, Databricks Jobs, or AWS Step Functions
- Hands-on experience with API testing using Postman, pytest, or custom automation scripts
- Proven track record of leading and mentoring QA/test engineering teams
- Ability to define and own the test automation strategy and roadmap for data platforms
- Strong collaboration skills to work with engineering, product, and data teams
- Excellent communication skills for presenting test results, quality metrics, and project health to leadership
- Contributions to internal quality dashboards or data observability systems
- Awareness of metadata-driven testing approaches and lineage-based validations
- Experience working with agile testing methodologies such as Scaled Agile
- Familiarity with automated testing frameworks like Selenium, JUnit, TestNG, or PyTest

Good-to-Have Skills:
- Experience with data governance tools such as Apache Atlas, Collibra, or Alation
- Understanding of DataOps methodologies and practices
- Familiarity with monitoring/observability tools such as Datadog, Prometheus, or CloudWatch
- Experience building or maintaining test data generators

Education and Professional Certifications:
- Bachelor's/Master's degree in computer science and engineering preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
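As a rough illustration of the reusable data quality checks this posting lists (schema validation, null checks, deduplication, and threshold rules), here is a hedged PySpark sketch. The column names, S3 path, and expected row count are hypothetical, and a production framework would report results to a dashboard or alerting system rather than printing them.

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

EXPECTED_COLUMNS = {"claim_id", "patient_id", "amount", "load_date"}  # hypothetical contract


def schema_check(df: DataFrame) -> list:
    """Return the expected columns missing from the frame."""
    return sorted(EXPECTED_COLUMNS - set(df.columns))


def null_check(df: DataFrame, key_cols: list) -> dict:
    """Count nulls in columns that must always be populated."""
    return {c: df.filter(col(c).isNull()).count() for c in key_cols}


def duplicate_check(df: DataFrame, key_cols: list) -> int:
    """Count rows beyond the first occurrence of each business key."""
    return df.count() - df.dropDuplicates(key_cols).count()


def threshold_check(row_count: int, expected: int, tolerance: float = 0.1) -> bool:
    """Flag loads whose volume deviates from the expected count by more than the tolerance."""
    return abs(row_count - expected) <= expected * tolerance


df = spark.read.parquet("s3://example-bucket/claims/")  # placeholder path
results = {
    "missing_columns": schema_check(df),
    "null_keys": null_check(df, ["claim_id", "patient_id"]),
    "duplicate_rows": duplicate_check(df, ["claim_id"]),
    "volume_ok": threshold_check(df.count(), expected=1_000_000),
}
print(results)
```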
Posted 2 weeks ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
What you will do
Let’s do this. Let’s change the world. In this vital role you will be responsible for working on data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. The role requires performing data transformation tasks to ensure data accuracy and integrity, working closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and automating operations, monitoring system health, and responding to incidents to minimize downtime.

Responsibilities:
- Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses, or data lakes
- Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems
- Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements (see the sketch following this description)
- Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors
- Identify and resolve technical challenges effectively
- Stay updated with the latest trends and advancements
- Work closely with the product team, business team, and other stakeholders

What we expect of you
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Basic Qualifications:
- Strong expertise in ETL development, data integration, and managing complex ETL workflows, performance tuning, and debugging
- Strong proficiency in SQL for querying databases, writing scripts, and troubleshooting ETL processes
- Understanding of data modeling concepts, various schemas, and normalization
- Strong understanding of software development methodologies, including Agile and Scrum
- Experience working in a DevOps environment, which involves designing, developing, and maintaining software applications and solutions that meet business needs

Preferred Qualifications:
- Extensive experience in Informatica PowerCenter or Informatica Cloud for data integration and ETL development

Professional Certifications:
- SAFe® for Teams certification (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals

Shift Information: This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work during evening shifts, as required based on business requirements.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
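Informatica mappings are built in the tool itself rather than in code, but the transformation pattern described above (cleanse, enrich, filter, aggregate) can be sketched generically. Below is a minimal PySpark illustration; the file paths, table, and column names are all hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: hypothetical source file of raw orders.
raw = spark.read.option("header", "true").csv("/data/raw/orders.csv")

clean = (
    raw
    .dropna(subset=["order_id", "customer_id"])            # cleanse: drop rows missing keys
    .withColumn("amount", F.col("amount").cast("double"))  # cleanse: enforce types
    .filter(F.col("status") != "CANCELLED")                # filter: exclude unwanted records
)

# Enrich: join to a hypothetical customer dimension.
customers = spark.read.parquet("/data/dim/customers")
enriched = clean.join(customers, on="customer_id", how="left")

# Aggregate: daily totals per region, then load to the target location.
daily = enriched.groupBy("region", "order_date").agg(F.sum("amount").alias("total_amount"))
daily.write.mode("overwrite").parquet("/data/curated/daily_sales")
```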
Posted 2 weeks ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
What you will do
Let’s do this. Let’s change the world. In this vital role you will be responsible for working on data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. The role requires performing data transformation tasks to ensure data accuracy and integrity, working closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and automating operations, monitoring system health, and responding to incidents to minimize downtime.

Responsibilities:
- Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses, or data lakes
- Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems
- Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements
- Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors
- Identify and resolve technical challenges effectively
- Stay updated with the latest trends and advancements
- Work closely with the product team, business team, and other stakeholders

What we expect of you
- Bachelor's degree and 0 to 3 years of Computer Science, IT, or related field experience; OR Diploma and 4 to 7 years of Computer Science, IT, or related field experience

Basic Qualifications:
- Expertise in ETL development, data integration, and managing complex ETL workflows, performance tuning, and debugging
- Proficient in SQL for querying databases, writing scripts, and troubleshooting ETL processes
- Understanding of data modeling concepts, various schemas, and normalization
- Strong understanding of software development methodologies, including Agile and Scrum
- Experience working in a DevOps environment, which involves designing, developing, and maintaining software applications and solutions that meet business needs

Preferred Qualifications:
- Expertise in Informatica PowerCenter or Informatica Cloud for data integration and ETL development

Professional Certifications:
- SAFe® for Teams certification (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals

Shift Information: This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work during evening shifts, as required based on business requirements.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
Posted 2 weeks ago
3.0 - 7.0 years
4 - 7 Lacs
Hyderabad
Work from Office
What you will do
Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and driving data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to standard methodologies for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree; OR Master's degree and 4 to 6 years of Computer Science, IT, or related field experience; OR Bachelor's degree and 6 to 8 years of Computer Science, IT, or related field experience; OR Diploma and 10 to 12 years of Computer Science, IT, or related field experience

Preferred Qualifications:
Functional Skills:

Must-Have Skills:
- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
- Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
- Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Understanding of machine learning pipelines and frameworks for ML/AI models

Professional Certifications (preferred or required for the role):
- AWS Certified Data Engineer (preferred)
- Databricks Certified (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

Equal opportunity statement
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 2 weeks ago
3.0 - 8.0 years
1 - 4 Lacs
Hyderabad
Work from Office
Role Description: The R&D Data Catalyst Team is responsible for building data searching, cohort building, and knowledge management tools that provide the Amgen scientific community with visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications. This hands-on technologist is an expert in utilizing the Microsoft Fabric platform to deliver the front end of the Data Catalyst solutions, providing overall guidance on platform capabilities, standards, and techniques to the team and to the wider Amgen technical community.

Roles & Responsibilities:
- Oversee and manage the implementation, operations, and performance of Microsoft Fabric and Power BI solutions.
- Develop, standardize, and enforce operational processes and best practices for Power BI development and reporting across the organization.
- Collaborate with cross-functional teams to ensure seamless integration and performance of data pipelines, reports, and dashboards in Power BI.
- Monitor and optimize the performance of Power BI reports and dashboards, ensuring high availability and responsiveness.
- Ensure governance, data security, and compliance standards are maintained across Power BI and Microsoft Fabric environments.
- Provide ongoing support, troubleshooting, and training for Power BI users within the organization.
- Stay up to date with the latest updates, features, and best practices related to Microsoft Fabric and Power BI, and ensure these are incorporated into operations and standards.
- Lead and contribute to the development and execution of strategies for the enhancement of Power BI capabilities, ensuring consistency in user experience and report quality.
- Define and document technical standards, best practices, and data governance policies for Power BI, Microsoft Fabric, and Microsoft Power Apps.
- Create, prioritize, and maintain the backlog for standards development, POCs, and optimization work.
- Identify techniques that match user feature needs.
- Develop POCs for new Microsoft feature releases.
- Maintain "Power BI 101" and higher-level training materials and deliver training to various communities.
- Conduct market research and competitive analysis to identify opportunities and inform product strategy.
- Analyze customer feedback and support data to identify pain points and opportunities for product improvement.

Basic Qualifications and Experience:
- Master's degree with 4 to 6 years of experience as a Product Owner / Platform Owner / Service Owner, OR Bachelor's degree with 6 to 8 years of experience as a Product Owner / Platform Owner / Service Owner

Functional Skills:
Must-Have Skills:
- Strong knowledge of Agile methodologies and product management principles, and backlog and feature management tools including Jira and Confluence
- Strong knowledge of IT service management (ITSM) principles and methodologies
- Strong experience in Microsoft Fabric, including data integration, data management, and analytics solutions
- Expertise in Power Query, DAX, and Power BI Desktop for data transformation, modeling, and visualization
- Solid understanding of data governance, security, and compliance in Power BI and Microsoft Fabric environments
- Experience with Power BI Service and Power BI Report Server for report publishing, sharing, and management
- Proven experience optimizing performance and troubleshooting Power BI reports and dashboards
- Ability to work in a collaborative, fast-paced environment with multiple teams
- Strong communication skills to engage with both technical and non-technical stakeholders

Good-to-Have Skills:
- Experience in developing differentiated and deliverable solutions
- Experience with human data, ideally human healthcare data
- Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management

Professional Certifications (preferred or mandatory for the role):
- Microsoft Certified: Data Analyst Associate (Power BI) or related certification
- Experience with Microsoft Azure and cloud-based analytics solutions
- Knowledge of SQL, data warehousing, and ETL processes
- Experience with Power Automate and Power Apps integrations with Power BI
- SAFe Agile Practitioner (6.0)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Deep intellectual curiosity, particularly about data patterns, learning about business processes, and the life of the user
- Highest degree of initiative and self-motivation
- Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences
- Confidence in leading teams through prioritization and sequencing discussions, including managing stakeholder expectations
- Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources
Posted 2 weeks ago
10.0 - 14.0 years
30 - 35 Lacs
Bengaluru
Work from Office
OVERALL PURPOSE OF THE ROLE: The purpose of this role is to build in-house technical expertise for the data integration area and to deliver data integration services for the platform.

Primary Goals and Objectives: This role is responsible for the delivery model for Data Integration services. The person should be responsible for building technical expertise on data integration solutions and providing data integration services. The role is viewed as an expert in solution design, development, performance tuning, and troubleshooting for data integration.

RESPONSIBILITIES:
Technical:
- Hands-on experience architecting and delivering solutions related to enterprise integration, APIs, service-oriented architecture, and technology modernizations
- 3-4 years of hands-on experience with the design and implementation of integrations in Dell Boomi
- Understand the business requirements and functional requirement documents, and design a technical solution as per the needs
- Strong grounding in Master Data Management, migration, and governance best practices
- Extensive data quality and data migration experience, including proficiency in data warehousing, data analysis, and conversion planning for data migration activities
- Lead and build data migration objects as needed for conversions of data from different sources
- Should have architected integration solutions using Dell Boomi for cloud, hybrid, and on-premise integration landscapes
- Ability to build and architect high-performing, highly available, highly scalable Boomi Molecule infrastructure
- In-depth understanding of enterprise integration patterns and the prowess to apply them in the customer's IT landscape
- Assist project teams during system design to promote the efficient re-use of IT assets
- Advise project teams during system development to assure compliance with architectural principles, guidelines, and standards
- Adept at building Boomi processes with error handling, email alerts, and logging best practices
- Proficient in using enterprise-level and database connectors
- Excellent understanding of REST, with in-depth understanding of how Boomi processes can expose and consume services using the different HTTP methods, URIs, and media types
- Understand Atom, Molecule, and Atmosphere configuration and management, platform monitoring, performance optimization suggestions, platform extension, and user permissions control
- Knowledge of API governance and skills such as caching, DB management, and data warehousing
- Hands-on experience configuring AS2, HTTPS, and SFTP involving different authentication methods
- Thorough knowledge of process deployment, applying extensions, setting up schedules, Web Services user management, process filtering, and process reporting
- Expert with XML and JSON activities such as creation, mapping, and migrations
- Should have worked on integration of SAP, SuccessFactors, SharePoint, cloud-based apps, web applications, and engineering applications
- Support and resolve issues related to data integration deliveries or the platform

Project Management:
- Deliver data integration projects using the data integration platform
- Manage partner deliveries by setting up governance of their deliveries
- Understand project priorities, timelines, budget, and deliverables, and proactively push yourself and others to achieve project goals

Managerial: This person is an individual contributor who operationally manages a small technical team.

Qualifications & Skills:
- 10+ years of experience in the area of enterprise integrations
- Minimum 3-4 years of experience with Dell Boomi
- Working experience with databases such as SQL Server, and with data warehousing
- Hands-on experience with REST, SOAP, XML, JSON, SFTP, and EDI
- Should have worked on integration of multiple technologies such as SAP, web, and cloud-based apps

EDUCATION: B.E.

BEHAVIORAL COMPETENCIES:
- Demonstrates excellent collaboration skills, as the person will be interacting with multiple business units, solution managers, and internal IT teams
- Excellent analytical and problem-solving skills
- Coaches, supports, and trains other team members
- Demonstrates excellent communication skills

TECHNICAL COMPETENCIES & EXPERIENCE: Technical expertise in Dell Boomi for data integration is a MUST.
Language Skills: English
IT Skills: Dell Boomi, SQL, REST APIs, EDI, JSON, XML
Location: Bangalore. Travel: 5%.
Posted 2 weeks ago
7.0 - 9.0 years
15 - 30 Lacs
Hyderabad
Hybrid
PepsiCo - Sales Operations
Posted 2 weeks ago
5.0 - 10.0 years
10 - 12 Lacs
Noida
Work from Office
Job Title: Data Warehouse Developer II
Location: Noida
Department: IT
Reports To: IT Supervisor/Manager/Director
Direct Reports: No

Job Summary
The Data Warehouse Developer is responsible for designing, developing, maintaining, and supporting data transformation, integration, and analytics solutions across both cloud and on-premises environments. This role also provides 24x7 support for global systems.

Key Responsibilities
- Understand and translate business requirements into technical solutions.
- Develop, test, debug, document, and implement ETL processes.
- Ensure performance, scalability, reliability, and security of solutions.
- Work with structured and semi-structured data across multiple platforms.
- Participate in Agile practices, including daily SCRUM meetings.
- Collaborate with infrastructure teams, DBAs, and software developers.
- Adhere to corporate standards for databases, data engineering, and analytics.
- Provide accurate time estimates, communicate status, and flag risks.
- Work across the full SDLC (analysis to support) using Agile methodologies.
- Demonstrate motivation, self-drive, and strong communication skills.
- Perform other related duties as assigned.

Requirements
Education & Experience
- Bachelor's degree or equivalent work experience.
- 5+ years in software development/data engineering roles.
- At least 2 years of dedicated data engineering experience preferred.

Technical Skills
- Strong experience with data transformations and manipulation.
- Ability to design data stores for analytics and other needs.
- Familiarity with traditional and modern data architectures (e.g., data lakes).
- Hands-on experience with cloud-native data tools (Azure preferred; GCP is a plus).
- Proficiency in traditional Microsoft ETL tools: SSIS, SSRS, SSAS, Power BI.
- Experience with Azure Data Factory is a plus.

Soft Skills
- Ability to present and document clearly.
- Self-motivated and independent.
- Strong partnership and credibility with stakeholders.

Work Environment
- Standard office setting.
- Use of standard office equipment.
Posted 2 weeks ago
5.0 - 7.0 years
32 - 40 Lacs
Bengaluru
Work from Office
Design, develop, and optimize large-scale data processing pipelines using PySpark. Work with various Apache tools and frameworks (like Hadoop, Hive, HDFS, etc.) to ingest, transform, and manage large datasets.
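A minimal sketch of the kind of PySpark pipeline this posting refers to, reading a Hive-managed table and writing partitioned Parquet to HDFS; the database, table, column names, and paths are assumptions rather than details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets the session read managed tables directly; names are hypothetical.
spark = (
    SparkSession.builder
    .appName("ingest-transform")
    .enableHiveSupport()
    .getOrCreate()
)

events = spark.table("raw_db.web_events")

daily = (
    events
    .filter(F.col("event_date") >= "2024-01-01")
    .groupBy("event_date", "page")
    .agg(F.countDistinct("user_id").alias("unique_users"))
)

# Persist partitioned output to HDFS for downstream consumers.
(
    daily.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("hdfs:///warehouse/curated/daily_page_users")
)
```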
Posted 2 weeks ago
1.0 - 3.0 years
3 - 7 Lacs
Hyderabad
Work from Office
What you will do
Role Description: The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing data pipelines, supporting and executing back-end web development, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in the design and development of data pipelines used for reports and/or back-end web application development
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies
- Proficiency in workflow orchestration and performance tuning for big data processing
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Strong understanding of AWS services
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Strong understanding of data governance frameworks, tools, and best practices

Preferred Qualifications:
- Data engineering experience in the biotechnology or pharma industry
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Professional Certifications:
- Certified Data Engineer (preferred, on Databricks or cloud environments)
- Certified SAFe Agilist (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills
Posted 2 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Experience: 5+ years
Job Title: Platform Administrator / Data Platform Administrator
Job Summary: We are seeking a highly skilled Platform Administrator with expertise in cloud solutions, data platform management, and data pipeline orchestration. The ideal candidate will have a strong background in AWS architecture, experience with tools such as Airflow, Informatica IDMC, and Snowflake, and a proven ability to design, implement, and maintain robust data platforms that support scalable, secure, and cost-effective operations.
Key Responsibilities:
Administer and manage cloud-based data platforms, ensuring high availability, scalability, and performance optimization.
Architect and implement solutions on AWS to support data integration, transformation, and analytics needs.
Leverage AWS services (such as EC2, S3, Lambda, RDS, and Redshift) to design and deploy secure and efficient cloud infrastructure.
Manage and optimize data pipelines using Apache Airflow, ensuring reliable scheduling and orchestration of ETL processes (see the DAG sketch after this listing).
Oversee data integration and management processes using Informatica IDMC for data governance, quality, and privacy in the cloud.
Administer and optimize Snowflake data warehousing solutions for querying, storage, and data analysis.
Collaborate with cross-functional teams to ensure seamless data flow, system integration, and business intelligence capabilities.
Ensure compliance with industry standards and best practices for cloud security, data privacy, and cost management.
Required Qualifications:
AWS Certified Solutions Architect - Professional or equivalent.
Strong experience in platform administration and cloud architecture.
Hands-on experience with Airflow, Informatica IDMC, and Snowflake.
Proficiency in designing, deploying, and managing cloud-based solutions on AWS.
Familiarity with data integration, ETL processes, and data governance best practices.
Knowledge of scripting languages (Python, Shell, etc.) for automation and workflow management.
Excellent troubleshooting, problem-solving, and performance optimization skills.
Strong communication skills and the ability to collaborate with teams across technical and non-technical disciplines.
Preferred Qualifications:
Experience in multi-cloud environments.
Knowledge of containerization technologies (e.g., Docker, Kubernetes).
Familiarity with data visualization tools and business intelligence platforms.
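As a minimal sketch of the Airflow scheduling and orchestration work this listing mentions: the DAG ID, schedule, retry settings, and task bodies below are assumptions for illustration, not requirements from the posting.

```python
# Hypothetical Airflow DAG sketch; IDs, schedule, and task logic are illustrative assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting source data")      # placeholder: pull data from a source system


def transform():
    print("transforming data")           # placeholder: apply business transformations


def load():
    print("loading into warehouse")      # placeholder: load curated data into the warehouse


with DAG(
    dag_id="daily_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Enforce ordering: extract -> transform -> load
    t_extract >> t_transform >> t_load
```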
Posted 2 weeks ago
4.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.
Key Responsibilities:
Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
Manage data ingestion, transformation, and loading processes to ensure data quality and performance (a minimal loading sketch follows this listing).
Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
Drive continuous improvement by leveraging the latest Snowflake features and industry trends.
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
4+ years of experience in data architecture, data engineering, or a related field.
Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
Strong SQL skills are a must.
Proven track record of contributing to data projects and working in complex environments.
Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
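A minimal sketch of the kind of Snowflake ingestion step this listing describes, using the Snowflake Python connector; the account credentials, stage, table, and file format are illustrative assumptions only.

```python
# Hypothetical sketch of loading staged files into Snowflake; all names are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Target table for raw orders (illustrative DDL)
    cur.execute("""
        CREATE TABLE IF NOT EXISTS RAW_ORDERS (
            ORDER_ID STRING,
            ORDER_TS TIMESTAMP_NTZ,
            AMOUNT NUMBER(12, 2)
        )
    """)
    # Bulk-load CSV files from an assumed named stage
    cur.execute("""
        COPY INTO RAW_ORDERS
        FROM @ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
finally:
    conn.close()
```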
Posted 2 weeks ago
6.0 - 11.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Analytics Technical Specialist
Date: 19 May 2025
Location: Bangalore, IN
Company: Alstom
Req ID: 486332
STRUCTURE, REPORTING, NETWORKS & LINKS:
Organization Structure: CITO > Data & AI Governance Vice President > Enterprise Data Domain Director > Head of Analytics Platform > Analytics Delivery Architect > Analytics Technical Specialist
Organizational Reporting: Reports to the Delivery Manager
Networks & Links: Internally - Transversal Digital Platforms Team, Innovation Team, Application Platform Owners, Business Process Owners, Infrastructure Team; Externally - Third-party technology providers, Strategic Partners
Location: The position will be based in Bangalore, with occasional travel for onsite meetings and team workshops as required.
RESPONSIBILITIES:
Design, develop, and deploy interactive dashboards and reports using MS Fabric and Qlik Cloud, ensuring alignment with business requirements and goals.
Implement and manage data integration workflows utilizing MS Fabric to ensure efficient data processing and accessibility.
Translate business needs into technical specifications and design, build, and deploy solutions.
Understand and integrate Power BI reports into other applications using embedded analytics such as the Power BI service (SaaS), Teams, SharePoint, or via API automation (a minimal refresh-automation sketch follows this listing).
Be responsible for access management of app workspaces and content.
Integrate Power BI with different data sources and handle timely upgrades and servicing of Power BI.
Schedule and refresh jobs on the Power BI on-premises data gateway.
Configure standard system reports, as well as customized reports as required.
Help set up various database connections (SQL, Oracle, Excel, etc.) with the Power BI service.
Investigate and troubleshoot reporting issues and problems.
Maintain the reporting schedule and document reporting procedures.
Monitor and troubleshoot data flow issues, optimizing the performance of MS Fabric applications as needed.
Optimize application performance and data models in Qlik Cloud while ensuring data accuracy and integrity.
Collaborate with Functional & Technical Architects as business cases are set up for each initiative; collaborate with other analytics teams to drive and operationalize analytical deployment.
Maintain clear and coherent communication, both verbal and written, to understand data needs and report results.
Ensure compliance with internal policies and regulations.
Demonstrate a strong ability to take the lead and be autonomous.
Show proven planning, prioritization, and organizational skills, and the ability to drive change through innovation and process improvement.
Report to management and stakeholders in a clear and concise manner.
Good to have: contribution to the integration and utilization of Denodo for data virtualization, enhancing data access across multiple sources.
Document Denodo processes, including data sources and transformations, to support knowledge sharing within the team.
Facilitate effective communication with stakeholders regarding project updates, risks, and resolutions to ensure transparency and alignment.
Participate in team meetings and contribute innovative ideas to improve reporting and analytics solutions.
EDUCATION:
Bachelor's/Master's degree in Computer Science, Engineering/Technology, or a related field
Experience:
Minimum 3 and maximum 6 years of total experience
Mandatory 2+ years of experience in end-to-end Power BI development using Power BI Desktop, connecting multiple data sources (SAP, SQL, Azure, REST APIs, etc.)
Experience with MS Fabric components along with Denodo.
Technical competencies:
Proficient in using MS Fabric for data integration and automation of ETL processes.
Understanding of data governance principles for quality and security.
Strong expertise in creating dashboards and reports using Power BI and Qlik.
Knowledge of data modeling concepts in Qlik and Power BI.
Proficient in writing complex SQL queries for data extraction and analysis.
Skilled in utilizing analytical functions in Power BI and Qlik.
Experience in troubleshooting performance issues in MS Fabric and Denodo.
Experience in developing visual reports, dashboards, and KPI scorecards using Power BI Desktop and Qlik.
Understanding of the Power BI application security layer model.
Hands-on experience with PowerPivot, role-based data security, Power Query, DAX queries, Excel, pivots/charts/grids, and Power View.
Good to have: Power BI Service and administration knowledge.
Experience in developing data models using Denodo to support business intelligence and analytics needs.
Proficient in creating base views and derived views for effective data representation.
Ability to implement data transformations and enrichment within Denodo.
Skilled in using Denodo's SQL capabilities to write complex queries for data retrieval.
Familiarity with integrating Denodo with various data sources, such as databases, web services, and big data platforms.
BEHAVIORAL COMPETENCIES:
The candidate should demonstrate:
A strong sense of collaboration and being a team player
The ability to articulate issues and propose solutions
A structured thought process and clear articulation
Critical thinking and problem-solving skills
An analytical bent of mind and willingness to question the status quo
Excellent soft skills
The ability to work as an individual contributor, be proactive, and show leadership skills
The ability to guide and drive the team from a technical standpoint
Excellent written, verbal, and interpersonal skills
Self-motivation; being a quick learner is a must
Fluency in English
The ability to influence and deliver
You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!
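For context, a minimal sketch of automating a Power BI dataset refresh through the Power BI REST API, of the kind this listing's API-automation item suggests; the workspace and dataset IDs and the token acquisition are assumptions, not details from the posting.

```python
# Hypothetical sketch: queue a Power BI dataset refresh via the REST API.
# The token, workspace ID, and dataset ID below are placeholders.
import requests

ACCESS_TOKEN = "<Azure AD bearer token obtained via MSAL or a service principal>"
GROUP_ID = "<workspace-id>"      # app workspace (group) ID, assumed
DATASET_ID = "<dataset-id>"      # dataset to refresh, assumed

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
# A 202 Accepted response means the refresh was queued
resp.raise_for_status()
print("Refresh queued:", resp.status_code)
```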
Posted 2 weeks ago
6.0 - 11.0 years
6 - 10 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.
Key Responsibilities:
Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
Drive continuous improvement by leveraging the latest Snowflake features and industry trends.
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
6+ years of experience in data architecture, data engineering, or a related field.
Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
Must have hands-on exposure to Airflow.
Proven track record of contributing to data projects and working in complex environments.
Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
Posted 2 weeks ago
1.0 - 4.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Build data integrations and data models to support the analytical needs of this project.
Translate business requirements into technical requirements as needed.
Design and develop automated scripts for data pipelines to process and transform data per the requirements, and monitor them.
Produce artifacts such as data flow diagrams, designs, and data models, along with Git code, as deliverables.
Use tools and programming languages such as SQL, Python, Snowflake, Airflow, dbt, and Salesforce Data Cloud.
Ensure data accuracy, timeliness, and reliability throughout the pipeline.
Complete QA and data profiling to ensure data is ready for UAT as per the requirements (a minimal profiling sketch follows this listing).
Collaborate with business stakeholders and the visualization team, and support enhancements.
Provide timely updates on sprint boards and tasks; the team lead provides timely project updates on all projects.
Project experience with version control systems and CI/CD such as Git, GitFlow, Bitbucket, Jenkins, etc.
Participate in UAT to resolve findings and plan go-live/production deployment.
Milestones:
Data integration plan into Data Cloud for structured and unstructured data/RAG needs for the Sales AI use cases
Design data models and the semantic layer on Salesforce AI
Agentforce prompt integration
Data quality and sourcing enhancements
Write Agentforce prompts and refine as needed
Assist decision scientists with data needs
Collaborate with the EA team and participate in design reviews
Performance tuning and optimization of data pipelines
Hypercare after deployment
Project review and knowledge transfer
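A minimal sketch of the pre-UAT QA and data-profiling step this listing mentions, written with pandas; the file path, key column, and thresholds are assumptions for illustration only.

```python
# Hypothetical pre-UAT data-quality check; column names and file path are assumptions.
import pandas as pd


def profile(df: pd.DataFrame, key_columns: list[str]) -> dict:
    """Return simple data-quality metrics used to sign off a pipeline run."""
    return {
        "row_count": len(df),
        "duplicate_keys": int(df.duplicated(subset=key_columns).sum()),
        "null_counts": df[key_columns].isna().sum().to_dict(),
    }


if __name__ == "__main__":
    # Illustrative extract pulled from the pipeline's staging area
    df = pd.read_csv("staging/accounts.csv")
    report = profile(df, key_columns=["account_id"])
    print(report)
    # Fail fast before UAT if keys are not unique
    assert report["duplicate_keys"] == 0, "Duplicate keys found before UAT"
```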
Posted 2 weeks ago
5.0 - 8.0 years
15 - 30 Lacs
Pune
Hybrid
Skills: Data Engineer, Azure Data Factory (ADF), SQL, Power BI, SSRS, SSIS, SSAS, ETL, Databricks, Data Integration, Data Modeling
Posted 2 weeks ago
10.0 - 15.0 years
20 - 35 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Requirement: Senior Business Analyst (Data Application & Integration)
Experience: 10+ years
Location: Gurgaon (Hybrid)
Budget: Max 35 LPA
Preferred: Immediate joiners
Job Summary: We are seeking an experienced Senior Business Analyst (Data Application & Integration) to drive key data and integration initiatives. The ideal candidate will have a strong business analysis background and a deep understanding of data applications, API integrations, and cloud-based platforms such as Salesforce, Informatica, dbt, IICS, and Snowflake.
Key Responsibilities:
Gather, document, and analyze business requirements for data application and integration projects.
Work closely with business stakeholders to translate business needs into technical solutions.
Design and oversee API integrations to ensure seamless data flow across platforms.
Collaborate with cross-functional teams including developers, data engineers, and architects.
Define and maintain data integration strategies, ensuring high availability and security.
Work on Salesforce, Informatica, and Snowflake to streamline data management and analytics.
Develop use cases, process flows, and documentation to support business and technical teams.
Ensure compliance with data governance and security best practices.
Act as a liaison between business teams and technical teams, providing insights and recommendations.
Key Skills & Requirements:
Strong expertise in business analysis methodologies and data-driven decision-making.
Hands-on experience with API integration and data application management.
Proficiency in Salesforce, Informatica, dbt, IICS, and Snowflake.
Strong analytical and problem-solving skills.
Ability to work in an Agile environment and collaborate with multi-functional teams.
Excellent communication and stakeholder management skills.
Posted 2 weeks ago
10.0 - 12.0 years
8 - 11 Lacs
Bengaluru
Work from Office
Job Information:
Job Opening ID: ZR_1663_JOB
Date Opened: 17/12/2022
Industry: Technology
Work Experience: 10-12 years
Job Title: Power BI Lead
City: Bangalore
Province: Karnataka
Country: India
Postal Code: 560002
Number of Positions: 4
Location: Bangalore, Chennai
Develop Power BI reports in the Azure environment.
Write and manage basic SQL/PLSQL scripts for the reports.
Maintain and optimize the Power BI gateway.
Act as a Power BI expert, proposing improvements and best practices across the overall Azure Power BI setup (environment, new and existing reports, support, monitoring, etc.).
Posted 2 weeks ago