3.0 - 5.0 years
5 - 10 Lacs
Chennai
Work from Office
Job Summary: We are seeking a SAS Data Integration Developer to design, develop, and maintain Campaign Management Data Mart (CMDM) solutions, integrate multiple data sources, and ensure data quality for marketing analytics and campaign execution.
Key Responsibilities:
Data Integration & ETL Development
- Develop data ingestion, transformation, and deduplication pipelines.
- Standardize, cleanse, and validate large-scale customer data.
- Work with GaussDB, SAS ESP, APIs, and SAS DI Studio for data processing.
Master Data Management (CMDM) Configuration
- Implement unification and deduplication logic for a single customer view.
- Develop and manage data masking and encryption for security compliance.
API & CI360 Integration
- Integrate CMDM with SAS CI360 for seamless campaign execution.
- Ensure API connectivity and data flow across platforms.
Testing & Deployment
- Conduct unit, integration, and UAT testing.
- Deploy CMDM solutions to production and provide knowledge transfer.
Key Skills Required: SAS Data Integration Studio (SAS DI Studio); design, development, and maintenance of a Campaign Management Data Mart (CMDM); data management (SAS Base, SQL, data cleansing); SAS ESP, GaussDB, and API integration; data governance (RBAC, GDPR, PII compliance); data masking and encryption techniques.
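A real CMDM build would implement the masking requirement inside SAS DI Studio with managed keys; purely as a language-neutral sketch, here is one common approach, salted one-way hashing of PII columns. The field names and salt handling are assumptions.

import hashlib

SALT = "replace-with-a-managed-secret"  # assumption: in practice the salt comes from a vault, not source code

def mask_pii(record: dict, pii_fields=("email", "phone")) -> dict:
    """Return a copy of the record with PII fields replaced by salted SHA-256 digests."""
    masked = dict(record)
    for field in pii_fields:
        if masked.get(field):
            digest = hashlib.sha256((SALT + str(masked[field])).encode("utf-8")).hexdigest()
            masked[field] = digest[:16]  # a truncated digest still joins consistently without exposing raw PII
    return masked

print(mask_pii({"customer_id": 42, "email": "a@example.com", "phone": "9876543210"}))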
Posted 2 weeks ago
3.0 - 7.0 years
3 - 5 Lacs
Bengaluru
Hybrid
Role & Responsibilities: The MDM Analyst / Data Steward works closely with business stakeholders to understand and gather data requirements, develop data models and database designs, and define and implement data standards, policies, and procedures. This role also implements rules inside the MDM tool to improve the data, performs deduplication projects to develop golden records, and works overall toward improving the quality of data in the assigned domain.
Required Skills:
Technical Skills: Proficiency in MDM tools and technologies such as Informatica MDM, CluedIn, or similar platforms is essential. Familiarity with data modeling, data integration, and data quality control techniques is also important. Experience with data governance platforms like Collibra and Alation can be beneficial.
Analytical Skills: Strong analytical and problem-solving skills are crucial for interpreting and working with large volumes of data. The ability to translate complex business requirements into practical MDM solutions is also necessary.
Data Management: Experience in designing, implementing, and maintaining master data management systems and solutions, including data cleansing, data auditing, and data validation activities.
Communication and Collaboration: Excellent communication and interpersonal skills to collaborate effectively with business stakeholders, IT teams, and other departments.
Data Governance: In-depth knowledge of data governance, data quality, and data integration principles. The ability to develop and implement data management processes and policies is essential.
Educational Background: A Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field is typically required.
Certifications: Certification in the MDM domain (e.g., Certified MDM Professional) is a plus.
Key Skills:
- Become the expert on the assigned data domain
- Understand all source systems feeding into the MDM
- Write stewardship documentation for the domain
- Develop rules and standards for the data domain
- Generate measures of improvement to demonstrate data quality to the business
We are seeking candidates who can join immediately or within a maximum of 30 days' notice. A minimum of 3+ years of relevant experience is required. Candidates should be willing to relocate to Bangalore or already be based there, and should be flexible with working UK/US shifts.
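The deduplication-to-golden-record work described above usually comes down to survivorship rules. A minimal Python sketch, assuming a "newest non-null value wins" rule and entirely illustrative field names:

from datetime import date

def golden_record(duplicates):
    """Build a golden record from duplicate rows: the newest non-null value wins per field.
    Survivorship is driven by 'updated_at'; field names are illustrative."""
    ordered = sorted(duplicates, key=lambda r: r["updated_at"], reverse=True)
    golden = {}
    for row in ordered:
        for field, value in row.items():
            if field not in golden and value not in (None, ""):
                golden[field] = value
    return golden

dupes = [
    {"name": "A. Kumar", "email": None, "city": "Bengaluru", "updated_at": date(2024, 1, 5)},
    {"name": "Anil Kumar", "email": "anil@example.com", "city": "", "updated_at": date(2024, 6, 1)},
]
print(golden_record(dupes))  # name/email from the newer row, city survives from the older one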
Posted 2 weeks ago
1.0 - 6.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Role Description: As part of the cybersecurity organization, in this vital role you will be responsible for designing, building, and maintaining data infrastructure to support data-driven decision-making. This role involves working with large datasets, developing reports, executing data governance initiatives, and ensuring data is accessible, reliable, and efficiently managed. The role sits at the intersection of data infrastructure and business insight delivery, requiring the Data Engineer to design and build robust data pipelines while also translating data into meaningful visualizations for stakeholders across the organization. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture, ETL processes, and cybersecurity data frameworks.
Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Build data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Develop and maintain interactive dashboards and reports using tools like Tableau, ensuring data accuracy and usability.
- Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures.
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with data scientists to develop pipelines that meet dynamic business needs.
- Share and discuss findings with team members practicing the SAFe Agile delivery model.
Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience
Preferred Qualifications:
- Hands-on experience with data practices, technologies, and platforms such as Databricks, Python, GitLab, LucidChart, etc.
- Hands-on experience with data visualization and dashboarding tools (Tableau, Power BI, or similar) is a plus
- Proficiency in data analysis tools (e.g., SQL) and experience with data sourcing tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Understanding of data governance frameworks, tools, and best practices
- Knowledge of and experience with data standards (FAIR) and protection regulations and compliance requirements (e.g., GDPR, CCPA)
Good-to-Have Skills:
- Experience with ETL tools and various Python packages related to data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, and cloud data platforms
- Experience working in a product team environment
- Experience working in an Agile environment
Professional Certifications: AWS Certified Data Engineer preferred; Databricks certification preferred
Soft Skills: Initiative to explore alternate technologies and approaches to solving problems; skilled in breaking down problems, documenting problem statements, and estimating efforts; excellent analytical and troubleshooting skills; strong verbal and written communication skills; ability to work effectively with global, virtual teams; high degree of initiative and self-motivation; ability to handle multiple priorities successfully; team-oriented, with a focus on achieving team goals
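The "monitored for failures" responsibility above is often enforced with a simple data-quality gate between extract and load. A minimal Python sketch; the threshold and field names are assumptions, not part of the posting:

def quality_gate(rows, required_fields, max_null_rate=0.05):
    """Fail the pipeline run when required fields exceed the allowed null rate."""
    if not rows:
        raise ValueError("quality gate: empty extract, refusing to load")
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        rate = nulls / len(rows)
        if rate > max_null_rate:
            raise ValueError(f"quality gate: {field} null rate {rate:.1%} exceeds {max_null_rate:.0%}")
    return rows

# Example: block the load if more than 5% of events lack a source IP.
events = [{"source_ip": "10.0.0.1", "action": "login"}, {"source_ip": None, "action": "logout"}]
try:
    quality_gate(events, required_fields=["source_ip"])
except ValueError as err:
    print(err)  # in a scheduler this would mark the run failed and alert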
Posted 2 weeks ago
3.0 - 6.0 years
4 - 7 Lacs
Hyderabad
Work from Office
The Data Steward will play a critical role in ensuring data integrity, quality, and governance within SAP systems. Responsibilities include:
Data Governance:
- Define ownership and accountability for critical data assets to ensure they are effectively managed and maintain integrity throughout systems.
- Collaborate with business and IT teams to enforce data governance policies, ensuring alignment with enterprise data standards.
Data Quality Management:
- Promote data accuracy and adherence to defined data management and governance practices.
- Identify and resolve data discrepancies to enhance operational efficiency.
Data Integration and Maintenance:
- Manage and maintain master data quality for the Finance and Material domains within the SAP system.
- Support SAP data migrations, validations, and audits to ensure seamless data integration.
Compliance and Reporting:
- Ensure compliance with regulatory and company data standards.
- Develop and distribute recommendations and supporting documentation for new or proposed data standards, business rules, and policies.
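Business rules like those above are often codified as small, testable checks run before records enter SAP. A minimal Python sketch with entirely hypothetical field names and value sets, intended only to illustrate rule-based master data validation:

RULES = {
    "material_number": lambda v: bool(v),            # required field
    "base_unit": lambda v: v in {"EA", "KG", "L"},   # must be an approved unit of measure
    "material_group": lambda v: bool(v),             # required field
}

def validate_material(record: dict) -> list:
    """Return the list of rule violations for one material master record."""
    return [f"{field}: failed" for field, check in RULES.items() if not check(record.get(field))]

print(validate_material({"material_number": "M-1001", "base_unit": "BOX"}))
# -> ['base_unit: failed', 'material_group: failed']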
Posted 2 weeks ago
8.0 - 13.0 years
30 - 40 Lacs
Hyderabad
Work from Office
Looking for someone to work on an ARCS implementation plus support project for an elite customer in the digital sports platform industry, configuring, setting up, and implementing the end-to-end module and providing post-implementation support.
Requirements:
Oracle ARCS:
- Experience with at least one end-to-end ARCS implementation
- Proficient in setting up ARCS formats and rules
- Knowledge of the month-end close and reconciliation process
- Proficient in Excel
- Knowledge of ARCS Transaction Matching
- Ability to build automation using EPM Automate
- Knowledge of SQL and of building ARCS custom reports
- Hands-on experience with the functional and operational aspects of ARCS application design and the development of various application artifacts
- Proficient in Data Exchange / FDMEE / Data Integration & EPMi
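ARCS Transaction Matching pairs source-system and subledger activity on configurable rules. As a tool-agnostic illustration of the concept only (not the ARCS engine itself), a minimal Python sketch of a one-to-one match on a shared reference with an amount tolerance:

def match_transactions(source, subledger, amount_tolerance=0.01):
    """One-to-one match on a shared reference id, with an amount tolerance.
    Returns (matches, exceptions); unmatched items on either side become exceptions."""
    by_ref = {t["ref"]: t for t in subledger}
    matches, exceptions = [], []
    for txn in source:
        other = by_ref.pop(txn["ref"], None)
        if other is None:
            exceptions.append(txn)
        elif abs(txn["amount"] - other["amount"]) <= amount_tolerance:
            matches.append((txn, other))
        else:
            exceptions.extend([txn, other])  # amount break: both sides need review
    exceptions.extend(by_ref.values())       # subledger items never referenced by the source
    return matches, exceptions

m, e = match_transactions([{"ref": "INV-1", "amount": 100.00}],
                          [{"ref": "INV-1", "amount": 100.005}])
print(len(m), len(e))  # 1 0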
Posted 2 weeks ago
8.0 - 13.0 years
30 - 35 Lacs
Hyderabad
Work from Office
What you will do: In this vital role you will develop an insight-driven sensing capability with a focus on revolutionizing decision making. You will lead the technical delivery for this capability as part of a team of data engineers and software engineers. The team will rely on your leadership to own and refine the vision, prioritize features, align partners, and lead solution delivery while building this ground-breaking new capability for Amgen. You will drive the software engineering side of the product release and deliver on the outcomes.
Roles & Responsibilities:
- Lead delivery of the overall product and product features from concept to end of life; manage the product team, comprising technical engineers, product owners, and data scientists, to ensure that business, quality, and functional goals are met with each product release
- Drive excellence and quality for the respective product releases, collaborating with partner teams; impact the quality, efficiency, and effectiveness of your own team; provide significant input into priorities
- Incorporate and prioritize feature requests into the product roadmap; translate the roadmap into execution
- Design and implement usability, quality, and delivery of a product or feature
- Plan releases and upgrades with no impact to the business
- Bring hands-on expertise in driving quality and best-in-class Agile engineering practices
- Encourage and motivate the product team to deliver innovative and exciting solutions with an appropriate sense of urgency
- Manage progress of work and address production issues during sprints
- Communicate with partners to make sure goals are clear and the vision is aligned with business objectives
- Directly manage and develop staff on the team
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master's degree and 8 to 10 years of Information Systems experience OR Bachelor's degree and 10 to 14 years of Information Systems experience OR Diploma and 14 to 18 years of Information Systems experience
- Thorough understanding of modern web application development and delivery, Gen AI application development, data integration, and enterprise data fabric concepts, methodologies, and technologies (e.g., AWS technologies, Databricks)
- Demonstrated experience in building strong teams with consistent practices
- Demonstrated experience in navigating a matrix organization and leading change
- Prior experience writing business case documents and securing funding for product team delivery; financial/spend management for small to medium product teams is a plus
- In-depth knowledge of Agile processes and principles
- Define success metrics for developer productivity; analyze monthly/quarterly how the product team is performing against established KPIs
Functional Skills:
Leadership:
- Influences through Collaboration: Builds direct and behind-the-scenes support for ideas by working collaboratively with others.
- Strategic Thinking: Anticipates downstream consequences and tailors influencing strategies to achieve positive outcomes.
- Transparent Decision-Making: Clearly articulates the rationale behind decisions and their potential implications, continuously reflecting on successes and failures to enhance performance and decision-making.
- Adaptive Leadership: Recognizes the need for change and actively participates in technical strategy planning.
Preferred Qualifications:
- Strong influencing skills; able to influence stakeholders and balance priorities
- Prior experience in vendor management
- Prior hands-on experience leading full-stack development using infrastructure cloud services (AWS preferred) and cloud-native tools and design patterns (containers, serverless, Docker, etc.)
- Experience developing solutions on AWS technologies such as S3, EMR, Spark, Athena, Redshift, and others
- Familiarity with cloud security (AWS/Azure/GCP)
- Conceptual understanding of DevOps tools (Ansible/Chef/Puppet/Docker/Jenkins)
Professional Certifications:
- AWS Certified Solutions Architect (preferred)
- Certified DevOps Engineer (preferred)
- Certified Agile Leader or similar (preferred)
Soft Skills: Strong desire for continuous learning to pick up new tools and technologies; high attention to detail with critical-thinking ability; active contributor in technology communities and forums; proactively engages with cross-functional teams to resolve issues and design solutions using critical thinking, analysis skills, and best practices; influences and energizes others toward the common vision and goal; maintains excitement for a process and drives toward new ways of meeting the goal even when odds and setbacks render one path impassable; established habit of proactive thinking and behavior and the desire and ability to self-start and apply new technologies; excellent organizational and time-management skills; strong verbal and written communication skills; ability to work effectively with global, virtual teams; high degree of initiative and self-motivation; ability to manage multiple priorities successfully; team-oriented, with a focus on achieving team goals; strong presentation and public-speaking skills.
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Hyderabad
Work from Office
Role Description: We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
- Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
- Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
- Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
- Ensure data security, compliance, and role-based access control (RBAC) across data environments.
- Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
- Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
- Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.
Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies
- Proficiency in workflow orchestration and performance tuning for big data processing
- Strong understanding of AWS services
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures
- Ability to quickly learn, adapt, and apply new technologies
- Strong problem-solving and analytical skills
- Excellent communication and teamwork skills
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices
Good-to-Have Skills:
- Deep expertise in the biotech and pharma industries
- Experience writing APIs to make data available to consumers
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with data modeling and performance tuning for both OLAP and OLTP databases
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
Education and Professional Certifications: Master's degree and 3 to 4+ years of Computer Science, IT, or related field experience OR Bachelor's degree and 5 to 8+ years of Computer Science, IT, or related field experience; AWS Certified Data Engineer preferred; Databricks certification preferred; Scaled Agile SAFe certification preferred
Soft Skills: Excellent analytical and troubleshooting skills; strong verbal and written communication skills; ability to work effectively with global, virtual teams; high degree of initiative and self-motivation; ability to manage multiple priorities successfully; team-oriented, with a focus on achieving team goals; ability to learn quickly and be organized and detail-oriented; strong presentation and public-speaking skills.
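For context on the ETL/ELT pattern named above, a minimal PySpark sketch of a batch step: read raw files, standardize, deduplicate for idempotent re-runs, and write partitioned Parquet. Bucket paths and column names are illustrative only.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Batch step: read raw JSON, standardize, deduplicate, write partitioned Parquet.
raw = spark.read.json("s3://bucket/raw/orders/")          # illustrative path
clean = (
    raw.withColumn("order_date", F.to_date("order_ts"))   # derive the partition column
       .dropDuplicates(["order_id"])                      # keeps re-runs idempotent
       .filter(F.col("amount") > 0)                       # basic quality rule
)
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://bucket/curated/orders/")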
Posted 2 weeks ago
4.0 - 6.0 years
14 - 24 Lacs
Hyderabad
Hybrid
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively with Google Cloud Platform (GCP) services such as: Dataflow for real-time and batch data processing; Cloud Functions for lightweight serverless compute; BigQuery for data warehousing and analytics; Cloud Composer for orchestration of data workflows (based on Apache Airflow); Google Cloud Storage (GCS) for managing data at scale; IAM for access control and security; Cloud Run for containerized applications.
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
Required Skills:
- 4-6 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Good to Have (Optional Skills):
- Experience working with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
Additional Details: Excellent problem-solving and analytical skills; strong communication skills and ability to collaborate in a team environment.
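Since the role orchestrates workflows with Cloud Composer, here is a minimal Airflow DAG sketch of a daily pipeline, assuming Airflow 2.x as bundled with recent Composer versions; the DAG id and callable are placeholders.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load(**_):
    # Placeholder for the GCS -> BigQuery ingestion and cleansing logic.
    pass

with DAG(
    dag_id="daily_orders_pipeline",   # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)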
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.
Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze the data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirement
Deliver:
No. | Performance Parameter | Measure
1 | Analyses data sets and provides relevant information to the client | No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy
Mandatory Skills: Database Architecting.
Posted 2 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Chennai
Work from Office
About Company: Agilysys is well known for its long heritage of hospitality-focused technology innovation. The Company delivers modular and integrated software solutions and expertise to businesses seeking to maximize Return on Experience (ROE) through hospitality encounters that are both personal and profitable. Over time, customers achieve High Return Hospitality by consistently delighting guests, retaining staff, and growing margins. Customers around the world include branded and independent hotels; multi-amenity resort properties; casinos; property, hotel, and resort management companies; cruise lines; corporate dining providers; higher education campus dining providers; food service management companies; hospitals; lifestyle communities; senior living facilities; stadiums; and theme parks. The Agilysys Hospitality Cloud™ combines core operational systems for property management (PMS), point-of-sale (POS), and Inventory and Procurement (I&P) with Experience Enhancers™ that meaningfully improve interactions for guests and for employees across dimensions such as digital access, mobile convenience, self-service control, personal choice, payment options, service coverage, and real-time insights to improve decisions. Core solutions and Experience Enhancers are selectively combined in Hospitality Solution Studios™ tailored to specific hospitality settings and business needs. Agilysys operates across the Americas, Europe, the Middle East, Africa, Asia-Pacific, and India, with headquarters located in Alpharetta, GA. For more information, visit Agilysys.com.
Requirements & Responsibilities:
- Proficiency in MongoDB data modeling
- Strong experience with MongoDB query and index tuning
- Experience with MongoDB sharding and replication
- Troubleshooting MongoDB bottlenecks
- State-of-the-art MongoDB performance-tuning capabilities
- Respond to incidents and bring them to closure
- Ensure that the databases achieve maximum performance and availability
- Recommend and implement best practices
- Passion for troubleshooting the toughest problems and proposing creative solutions
Desired Experience: Hospitality experience.
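The query-and-index-tuning loop mentioned above typically means adding an index for a hot query pattern and confirming the planner uses it. A minimal pymongo sketch; the connection string, collection, and fields are illustrative.

from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # illustrative connection string
bookings = client["pms"]["bookings"]

# Compound index supporting a common lookup pattern (property, then check-in date range).
bookings.create_index([("property_id", ASCENDING), ("check_in", ASCENDING)])

# Inspect the winning plan: an IXSCAN here means the index is used, a COLLSCAN means it is not.
plan = bookings.find({"property_id": "H123", "check_in": {"$gte": "2024-06-01"}}).explain()
print(plan["queryPlanner"]["winningPlan"])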
Posted 2 weeks ago
5.0 - 10.0 years
3 - 8 Lacs
Navi Mumbai
Work from Office
Title: Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Senior SAS Programmer to join our Biostatistics team in Mumbai, India. This position will work on a team to accomplish tasks and projects that are instrumental to the company's success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you.
Overview: Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical, and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral, and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.
Responsibilities:
- Propose and develop specifications for new projects and serve as a project team leader
- Write SAS programs for use in creating analysis datasets, tables, listings, and figures
- Using SAS, program, validate, and maintain mapped databases
- Program edit checks for external data
- Own the setup, validation, and maintenance of mapped databases and the integration of external data with associated edit checks, writing good-quality programs independently
Qualifications:
- Bachelor's or Master's degree in Math, Statistics, Health Informatics, Data Science, Computer Science, or a life sciences field
- 5+ years' experience with SAS
- Excellent knowledge of CDISC standards
- SAS certification
- Thorough understanding of the pharmaceutical industry and federal regulations regarding electronic records
- Excellent analytical, written, and oral communication skills
- Good written and spoken English is required
People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today. The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.
Medpace Perks: Flexible work environment; competitive compensation and benefits package; competitive PTO packages; structured career paths with opportunities for professional growth; company-sponsored employee appreciation events; employee health and wellness initiatives.
Awards: Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023, and 2024. Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility.
What to Expect Next: A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.
Posted 2 weeks ago
1.0 - 3.0 years
4 - 8 Lacs
Hyderabad
Work from Office
What you will do: In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications and Experience: Master's degree and 1 to 3 years of experience in Computer Science, IT, or related field OR Bachelor's degree and 3 to 5 years of experience in Computer Science, IT, or related field OR Diploma and 7 to 9 years of experience in Computer Science, IT, or related field
Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), including workflow orchestration and performance tuning for big data processing.
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools.
- Excellent problem-solving skills and the ability to work with large, complex datasets.
Preferred Qualifications:
Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark and various Python packages related to data processing and machine learning model development.
- Strong understanding of data modeling, data warehousing, and data integration concepts.
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms.
Professional Certifications:
- Certified Data Engineer / Data Analyst (preferred, on Databricks or cloud environments)
- Certified Data Scientist (preferred, on Databricks or cloud environments)
- Machine Learning certification (preferred, on Databricks or cloud environments)
Soft Skills: Excellent critical-thinking and problem-solving skills; strong communication and collaboration skills; demonstrated awareness of how to function in a team setting; demonstrated presentation skills.
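One way the data-quality responsibility above shows up in PySpark work is schema-enforced ingestion followed by a SparkSQL sanity check. A minimal sketch; paths and column names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, DateType

spark = SparkSession.builder.appName("ingest_with_schema").getOrCreate()

# Enforcing an explicit schema on ingest surfaces bad records early, not at query time.
schema = StructType([
    StructField("sample_id", StringType(), nullable=False),
    StructField("assay", StringType(), nullable=True),
    StructField("value", DoubleType(), nullable=True),
    StructField("run_date", DateType(), nullable=True),
])
df = spark.read.schema(schema).option("mode", "PERMISSIVE").csv("/mnt/raw/assays/", header=True)

# Quick profile of what landed, via SparkSQL.
df.createOrReplaceTempView("assays")
spark.sql("SELECT assay, COUNT(*) AS n FROM assays GROUP BY assay ORDER BY n DESC").show()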
Posted 2 weeks ago
9.0 - 14.0 years
20 - 35 Lacs
Hyderabad, Pune
Work from Office
Our client is a leading global IT services and consulting organization.
Job Description:
Shift Timings: 12:00 noon to 9:30 PM IST
Location: Hyderabad & Pune only
Role: Architect
Key skills required for the job: Talend DI (mandatory) and good exposure to RDBMS databases like Oracle and SQL Server.
- 3+ years of experience implementing ETL projects in a large-scale enterprise data warehouse environment; at least one successful Talend implementation with a DWH is a must.
- As a senior developer, the candidate is responsible for development, support, maintenance, and implementation of a complex project module. The candidate is expected to have depth of knowledge in the specified technology area, including knowledge of applicable processes, methodologies, standards, products, and frameworks.
- Should have experience applying standard software development principles using Talend.
- Should be able to work as an independent team member, capable of applying judgment to plan and execute HWB tasks.
- Build reusable Talend jobs, routines, and components to support data integration, quality, and transformations.
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Role Description: Amgen is seeking a Sr Associate HR Data Analysis (Visier Admin). The Sr Associate HR Data Analysis (Visier Admin) will report to the Associate Director, HR Technology. The successful incumbent will have previous admin experience with the Visier reporting tool.
Roles & Responsibilities:
- Hands-on experience supporting Visier
- Previous experience with Vee
- Administrative tasks associated with Visier, such as role assignments and creating roles
- Visier security configuration, data integration, and data exports
- Ability to analyze, troubleshoot, and resolve Visier data issues
- Must have previous experience handling large datasets and sensitive HR data
Basic Qualifications and Experience: A minimum of 5 years of experience in human resources with hands-on Visier experience; Master's degree, OR Bachelor's degree and 5 years of HRIS experience
Functional Skills (Must-Have): Strong working knowledge of Visier; 5+ years of experience in human resources and a corporate service center supporting Workday
Soft Skills: Excellent analytical and troubleshooting skills; strong quantitative, analytical (technical and business), and problem-solving skills, with attention to detail; strong verbal and written communication and presentation skills; ability to work effectively with global, virtual teams; strong technical acumen, logic, judgment, and decision-making; strong initiative and desire to learn and grow; ability to manage multiple priorities successfully; exemplary adherence to ethics, data privacy, and compliance policies
Posted 2 weeks ago
8.0 - 10.0 years
5 - 8 Lacs
Vadodara
Work from Office
Job Description: SharePoint / Power BI Developer, able to lead new and ongoing Power BI projects. Upgrading, configuring, and debugging existing Power BI implementations. End-to-end experience in data modelling and integration using Power BI. Ability to analyse data, perform data mining, develop tools to transform data, build data warehouses, and work on data models as per business requirements. Leverage the best available technology for business benefit. Qualification: BE/BTech/BCA/MCA or equivalent
Posted 2 weeks ago
4.0 - 8.0 years
14 - 19 Lacs
Chennai, Gurugram, Bengaluru
Work from Office
We are currently seeking a Salesforce Data Cloud Architect to join our team in Hyderabad, Telangana, India.
Salesforce Data Cloud Expertise: Extensive knowledge of Salesforce Data Cloud features, capabilities, and best practices.
Data Modeling: Strong experience in designing and implementing data models.
Data Integration: Experience with data integration tools and techniques.
Data Quality: Understanding of data quality concepts and practices.
Data Governance: Knowledge of data governance principles and practices.
SQL: Proficiency in SQL for data querying and manipulation.
Problem-Solving: Strong analytical and problem-solving skills.
Communication: Excellent communication and collaboration skills.
Location: Bengaluru, Chennai, Gurugram, Hyderabad, Noida, Pune
Posted 2 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Pune
Work from Office
Req ID: 324609. We are currently seeking a Data Engineer to join our team in Pune, Maharashtra (IN-MH), India (IN).
Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
- 4+ years of experience supporting software engineering, data engineering, or data analytics projects.
- 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.
Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, Hadoop, and more.
- Hands-on knowledge of cloud and distributed data storage, including expertise in HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, encompassing Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.
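The streaming-to-data-lake pattern this role names is commonly built with Spark Structured Streaming reading Kafka. A minimal sketch; the broker, topic, and paths are placeholders, and the spark-sql-kafka connector package must be on the classpath.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Read a Kafka topic as an unbounded stream; broker and topic names are illustrative.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
         .select(F.col("value").cast("string").alias("payload"))
)

# Land micro-batches in the lake; the checkpoint makes the stream restartable exactly-once.
query = (
    events.writeStream.format("parquet")
          .option("path", "/data/lake/events/")
          .option("checkpointLocation", "/data/chk/events/")
          .start()
)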
Posted 2 weeks ago
5.0 - 10.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Req ID: 326727. We are currently seeking a Microsoft Fabric Specialist to join our team in Hyderabad, Telangana (IN-TG), India (IN). We are seeking a mid-level Microsoft Fabric support specialist to join our IT team. The ideal candidate will be responsible for providing technical support, troubleshooting, and ensuring the smooth operation of Microsoft Fabric services. This role requires a deep understanding of Microsoft Fabric, data integration, and analytics solutions, along with strong problem-solving skills.
Key Responsibilities:
- Provide technical support and troubleshooting for Microsoft Fabric services.
- Assist in the implementation, configuration, and maintenance of Microsoft Fabric environments.
- Monitor system performance and resolve issues proactively.
- Collaborate with cross-functional teams to optimize data workflows and analytics solutions.
- Document support procedures, best practices, and troubleshooting steps.
- Assist in user training and onboarding for Microsoft Fabric-related tools and applications.
- Stay up to date with the latest Microsoft Fabric updates and best practices.
Required Qualifications:
- 5+ years of experience in IT support, with a focus on Microsoft Fabric or related technologies.
- Strong knowledge of Microsoft Fabric, Power BI, Azure Synapse, and data integration tools.
- Experience with troubleshooting and resolving issues in a cloud-based environment.
- Familiarity with SQL, data pipelines, and ETL processes.
- Excellent problem-solving and communication skills.
- Ability to work independently and collaboratively in a team environment.
Preferred Qualifications:
- Microsoft certifications related to Fabric, Azure, or Power BI.
- Experience with automation and scripting (PowerShell, Python, etc.).
- Understanding of security and compliance considerations in cloud-based data platforms.
Posted 2 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Pune
Work from Office
Req ID: 324653. We are currently seeking a Data Engineer to join our team in Pune, Maharashtra (IN-MH), India (IN).
Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
- 4+ years of experience supporting software engineering, data engineering, or data analytics projects.
- 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.
Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, Hadoop, and more.
- Hands-on knowledge of cloud and distributed data storage, including expertise in HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, encompassing Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.
Posted 2 weeks ago
6.0 - 11.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Req ID: 319341. We are currently seeking an MS Fabric Architect - Support to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Key Responsibilities:
- Provide technical support and troubleshooting for Microsoft Fabric services.
- Assist in the implementation, configuration, and maintenance of Microsoft Fabric environments.
- Monitor system performance and resolve issues proactively.
- Collaborate with cross-functional teams to optimize data workflows and analytics solutions.
- Document support procedures, best practices, and troubleshooting steps.
- Assist in user training and onboarding for Microsoft Fabric-related tools and applications.
- Stay up to date with the latest Microsoft Fabric updates and best practices.
Required Qualifications:
- 6+ years of experience in IT support, with a focus on Microsoft Fabric or related technologies.
- Strong knowledge of Microsoft Fabric, Power BI, Azure Synapse, and data integration tools.
- Experience with troubleshooting and resolving issues in a cloud-based environment.
- Familiarity with SQL, data pipelines, and ETL processes.
- Excellent problem-solving and communication skills.
- Ability to work independently and collaboratively in a team environment.
Preferred Qualifications:
- Microsoft certifications related to Fabric, Azure, or Power BI.
- Experience with automation and scripting (PowerShell, Python, etc.).
- Understanding of security and compliance considerations in cloud-based data platforms.
Posted 2 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Chennai
Work from Office
Req ID: 324631. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).
Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
- 4+ years of experience supporting software engineering, data engineering, or data analytics projects.
- 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.
Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, Hadoop, and more.
- Hands-on knowledge of cloud and distributed data storage, including expertise in HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, encompassing Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.
Posted 2 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Chennai
Work from Office
Req ID: 324632. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).
Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
- 4+ years of experience supporting software engineering, data engineering, or data analytics projects.
- 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.
Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, Hadoop, and more.
- Hands-on knowledge of cloud and distributed data storage, including expertise in HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, encompassing Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.
Posted 2 weeks ago
2.0 - 6.0 years
1 - 5 Lacs
Noida
Work from Office
Req ID: 324014. We are currently seeking a Tableau Admin with AWS Experience to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).
Tableau Admin with AWS Experience: We are seeking a skilled Tableau Administrator with experience in AWS to join our team. The ideal candidate will be responsible for managing and optimizing our Tableau Server environment hosted on AWS, ensuring efficient operation, data security, and seamless integration with other data sources and analytics tools.
Key Responsibilities:
- Manage, configure, and administer Tableau Server on AWS, including setting up sites and managing user access and permissions.
- Monitor server activity and performance, conduct regular system maintenance, and troubleshoot issues to ensure optimal performance and minimal downtime.
- Collaborate with data engineers and analysts to optimize data sources and dashboard performance.
- Implement and manage security protocols, ensuring compliance with data governance and privacy policies.
- Automate monitoring and server management tasks using AWS and Tableau APIs.
- Assist in the design and development of complex Tableau dashboards. Provide technical support and training to Tableau users.
- Stay updated on the latest Tableau and AWS features and best practices, recommending and implementing improvements.
Qualifications:
- Proven experience as a Tableau Administrator, with strong skills in Tableau Server and Tableau Desktop.
- Experience with AWS, particularly with services relevant to hosting and managing Tableau Server (e.g., EC2, S3, RDS).
- Familiarity with SQL and experience working with various databases. Knowledge of data integration, ETL processes, and data warehousing principles.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
- Relevant certifications in Tableau and AWS are a plus.
A Tableau Administrator, also known as a Tableau Server Administrator, is responsible for managing and maintaining Tableau Server, a platform that enables organizations to create, share, and collaborate on data visualizations and dashboards. A typical job description for a Tableau Admin:
1. Server Administration: Install, configure, and maintain Tableau Server to ensure its reliability, performance, and security.
2. User Management: Manage user accounts, roles, and permissions on Tableau Server, ensuring appropriate access control.
3. Security: Implement security measures, including authentication, encryption, and access controls, to protect sensitive data and dashboards.
4. Data Source Connections: Set up and manage connections to various data sources, databases, and data warehouses for data extraction.
5. License Management: Monitor Tableau licensing, allocate licenses as needed, and ensure compliance with licensing agreements.
6. Backup and Recovery: Establish backup and disaster recovery plans to safeguard Tableau Server data and configurations.
7. Performance Optimization: Monitor server performance, identify bottlenecks, and optimize configurations to ensure smooth dashboard loading and efficient data processing.
8. Scaling: Scale Tableau Server resources to accommodate increasing user demand and data volume.
9. Troubleshooting: Diagnose and resolve issues related to Tableau Server, data sources, and dashboards.
10. Version Upgrades: Plan and execute server upgrades, apply patches, and stay current with Tableau releases.
11. Monitoring and Logging: Set up monitoring tools and logs to track server health, user activity, and performance metrics.
12. Training and Support: Provide training and support to Tableau users, helping them with dashboard development and troubleshooting.
13. Collaboration: Collaborate with data analysts, data scientists, and business users to understand their requirements and assist with dashboard development.
14. Documentation: Maintain documentation for server configurations, procedures, and best practices.
15. Governance: Implement data governance policies and practices to maintain data quality and consistency across Tableau dashboards.
16. Integration: Collaborate with IT teams to integrate Tableau with other data management systems and tools.
17. Usage Analytics: Generate reports and insights on Tableau usage and adoption to inform decision-making.
18. Stay Current: Keep up to date with Tableau updates, new features, and best practices in server administration.
A Tableau Administrator plays a vital role in ensuring that Tableau is effectively utilized within an organization, allowing users to harness the power of data visualization and analytics for informed decision-making.
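The "automate management tasks using Tableau APIs" responsibility above is commonly done through the Tableau Server REST API. A minimal sketch assuming the tableauserverclient (TSC) Python package; the server URL and token names are placeholders.

import tableauserverclient as TSC

# Sign in with a personal access token and enumerate sites; credentials are illustrative.
auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Pager transparently walks paginated REST results; listing sites needs admin rights.
    for site in TSC.Pager(server.sites):
        print(site.name, site.content_url)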
Posted 2 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Chennai
Work from Office
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.
Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze the data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirement
Deliver:
No. | Performance Parameter | Measure
1 | Analyses data sets and provides relevant information to the client | No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy
Mandatory Skills: Google BigQuery.
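Since Google BigQuery is the mandatory skill here, a minimal sketch of the kind of parameterized KPI pull the role describes, using the google-cloud-bigquery client; the dataset, table, and columns are hypothetical, and application-default credentials are assumed.

from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials are configured

# Parameterized query: safe against injection and friendly to BigQuery's result cache.
sql = """
    SELECT region, SUM(net_sales) AS sales
    FROM `analytics.sales_daily`      -- hypothetical table
    WHERE sale_date >= @start
    GROUP BY region
    ORDER BY sales DESC
"""
job = client.query(sql, job_config=bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01")]
))
for row in job.result():
    print(row.region, row.sales)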
Posted 2 weeks ago