7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP CPI for Data Services
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your day will involve collaborating with teams to create innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the team in implementing cutting-edge technologies
- Drive continuous improvement initiatives within the team

Professional & Technical Skills:
- Must-have: Proficiency in SAP CPI for Data Services
- Strong understanding of data integration and transformation processes
- Experience in developing and implementing data migration strategies
- Hands-on experience with SAP Cloud Platform Integration tools
- Knowledge of SAP ERP systems integration
- Good-to-have: Experience with SAP Cloud Platform Integration for process integration

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP CPI for Data Services
- This position is based at our Bengaluru office
- A 15 years full-time education is required
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum experience required: 2 years
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Must-have: Proficiency in Informatica MDM.
- Strong understanding of data integration processes and methodologies.
- Experience with data quality management and data governance practices.
- Familiarity with database management systems and data modeling techniques.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Informatica MDM.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 1 week ago
8.0 - 10.0 years
25 - 35 Lacs
Hyderabad
Work from Office
Job description:
Job Title: Tech Lead, Trading Risk & Compliance Systems
Department: Technology Services / Market Surveillance
Reports To: Head of Regulatory & Digital Experience

We are seeking a highly capable and proactive Senior Technical Lead (Market Surveillance) to drive the successful integration, onboarding, and ongoing operational support of our market surveillance platform. This role is essential in ensuring seamless collaboration with exchanges and trading venues, coordinating data integrations, and managing the performance and change lifecycle of the surveillance solution. You will be the technical focal point between internal teams, the platform vendor, and external counterparties to ensure the platform operates reliably and meets evolving business and regulatory needs.

Key Responsibilities:

Platform Integration & Exchange Onboarding:
- Lead technical efforts to integrate new exchanges, venues, and data sources into the surveillance platform, coordinating across vendor, internal infrastructure, and external exchange teams.
- Facilitate onboarding of new data feeds (e.g., order, trade, drop copy, market data), including format validation, protocol mapping (e.g., FIX), normalization, and certification.
- Manage and support testing cycles with exchanges and venues, including coordination of system integration testing (SIT), user acceptance testing (UAT), and issue triage.

Vendor Coordination & Change Management:
- Act as the primary technical point of contact with the market surveillance provider, ensuring the vendor delivers against SLAs and functional requirements.
- Oversee release coordination, environment management (e.g., test, UAT, prod), and feature deployments in partnership with the vendor and internal teams.
- Drive change request processes, track enhancement requests, and validate rule and configuration changes prior to production deployment.
- Ensure the SaaS platform continues to meet operational, regulatory, and security expectations.

Operational Oversight:
- Monitor platform availability, data completeness, and alert performance to ensure continuous surveillance coverage.
- Support incident triage and resolution in collaboration with the SaaS vendor, DevOps, and compliance stakeholders.
- Establish and maintain dashboards, operational runbooks, and incident response protocols.
- Drive regular vendor review sessions covering KPIs, issues, and platform roadmap alignment.

Cross-Functional Leadership:
- Partner with Compliance, Surveillance Operations, Legal, and IT Security to ensure surveillance capabilities align with regulatory mandates and internal controls.
- Represent technical interests in vendor governance meetings, regulatory audits, and internal control assessments.
- Translate compliance requirements into actionable technical tasks and track their delivery with vendor and internal teams.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 8+ years of experience in financial services technology, including 3+ years in a technical leadership or systems integration role.
- Strong understanding of market structure, trade lifecycle, and regulatory surveillance needs across asset classes.
- Proven experience working with third-party SaaS or managed surveillance platforms and external data providers (e.g., exchanges, brokers).
- Familiarity with integration protocols such as FIX, SFTP, and REST APIs, and with data transformation/validation processes.
- Excellent organizational, communication, and stakeholder management skills.

Preferred Skills:
- Exposure to major market surveillance platforms (e.g., Nasdaq SMARTS, Eventus Validus, Scila, ACA).
- Familiarity with regulatory frameworks such as CAT, MAR, MiFID II, and SEC Rule 15c3-5.
- Experience managing vendor relationships in a regulated environment.
- Working knowledge of monitoring and observability tools, and cloud/SaaS operations.

Note: Must be willing to travel to Dubai.
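Feed onboarding of the kind described above typically starts with tag=value parsing and checksum validation of FIX messages. A minimal, simplified sketch (the sample message and its fields are hypothetical, and real FIX messages also carry a BodyLength field, tag 9, omitted here):

```python
# Minimal FIX tag=value parser with checksum validation (illustrative only).
SOH = "\x01"  # FIX field delimiter

def parse_fix(message: str) -> dict:
    """Split a raw FIX message into a {tag: value} dict."""
    fields = {}
    for pair in message.rstrip(SOH).split(SOH):
        tag, _, value = pair.partition("=")
        fields[tag] = value
    return fields

def checksum_ok(message: str) -> bool:
    """Recompute tag 10: sum of all bytes up to and including the SOH
    that precedes the '10=' field, modulo 256, zero-padded to 3 digits."""
    body, sep, trailer = message.rpartition(SOH + "10=")
    if not sep:
        return False
    expected = trailer.rstrip(SOH)
    total = sum((body + SOH).encode("ascii")) % 256
    return f"{total:03d}" == expected

# Hypothetical new-order message (35=D), assembled for illustration.
raw = SOH.join(["8=FIX.4.2", "35=D", "49=SENDER", "56=TARGET", "55=XYZ"]) + SOH
raw += "10=" + f"{sum(raw.encode('ascii')) % 256:03d}" + SOH

fields = parse_fix(raw)
```

In practice a certified FIX engine handles this; the point is only that feed validation boils down to mechanical checks like these before data reaches the surveillance rules.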
Posted 1 week ago
5.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Required Skills:
We need experienced Informatica MDM engineers with strong ETL and data integration expertise to design and develop data integration solutions.
- 5+ years of hands-on experience with Informatica PowerCenter or IICS.
- Exposure to data quality tools, MDM, or real-time data integration.
- Strong experience with ETL design and performance tuning.
- Proficiency in writing complex SQL queries and working with relational databases (Oracle, SQL Server, etc.).
- Solid understanding of data warehousing concepts, data modeling, and architecture.
- Familiarity with job scheduling tools and version control systems.
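The "complex SQL" expectation in roles like this often means window functions, for example keeping only the latest record per business key in a staging table. A small self-contained sketch using SQLite via Python (table and column names are hypothetical; SQLite 3.25+ is assumed for window-function support):

```python
import sqlite3

# Dedup pattern common in ETL staging: keep the most recent row per
# customer_id using ROW_NUMBER(). Names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (customer_id INT, amount REAL, loaded_at TEXT);
    INSERT INTO staging_orders VALUES
        (1, 100.0, '2024-01-01'),
        (1, 150.0, '2024-02-01'),
        (2,  75.0, '2024-01-15');
""")
latest = conn.execute("""
    SELECT customer_id, amount
    FROM (
        SELECT customer_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY loaded_at DESC
               ) AS rn
        FROM staging_orders
    ) AS ranked
    WHERE rn = 1
    ORDER BY customer_id
""").fetchall()
```

The same PARTITION BY / ROW_NUMBER() shape carries over to Oracle and SQL Server, which the posting names as target databases.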
Posted 2 weeks ago
1.0 - 4.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:

Reporting Development and Data Integration:
- Assist with data projects related to integration with our core claims adjudication engines, eligibility, and other database items as necessary
- Support the data leads by producing ad hoc reports as needed based on requirements from the business
- Report on key milestones to our project leads
- Ensure all reporting aligns with brand standards
- Ensure PADU guidelines are followed for tools, connections, and data security
- Build a network with internal partners to assist with validating data quality

Analytical Skills Utilization:
- Apply analytical skills and develop business knowledge to support operations
- Identify automation opportunities through trends and day-to-day tasks to help create efficiencies within the team
- Perform root cause analysis via the "5 whys" method to identify process gaps and initiate process improvement efforts
- Assist with user testing for reports and business insights dashboards, and assist with automation validation review

Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Degree or equivalent data science, analysis, or mathematics experience
- Experience supporting operational teams' performance with reports and analytics
- Experience using Word (creating templates/documents), PowerPoint (creation and presentation), Teams, and SharePoint (document access/storage, sharing, List development and management)
- Basic understanding of reporting using business insights tools, including Tableau and Power BI
- Expertise in Excel (data entry, sorting/filtering) and VBA
- Proven solid communication skills, including oral, written, and organizational skills
- Proven ability to manage emotions effectively in high-pressure situations, maintaining composure and fostering a positive work environment conducive to collaboration and productivity

Preferred Qualifications:
- Experience leveraging and creating automation such as macros, Power Automate, or Alteryx/ETL applications
- Experience working with cloud-based servers; knowledge of database structure and stored procedures
- Experience performing root cause analysis and demonstrated problem-solving skills
- Knowledge of R/Python, SQL, DAX, or other coding languages
- Knowledge of multiple lines of business, benefit structures, and claims processing systems
Posted 2 weeks ago
6.0 - 10.0 years
13 - 17 Lacs
Chennai
Work from Office
Overview
Prodapt is looking for a Data Model Architect. The candidate should be strong in design and data architecture in the Telecom domain.

Responsibilities

Deliverables:
- Design & document the data model for the CMDB and the P-S-R catalogue (Product, Service and Resource management layer)
- Design, document, and build interface specifications for data integration

Activities:

Data Architecture and Modeling:
- Design and maintain conceptual, logical, and physical data models
- Ensure scalability and adaptability of data models for future organizational needs
- Data-model P-S-R catalogs in the existing Catalog, SOM, and COM systems

CMDB Design and Management:
- Architect and optimize the CMDB to accurately reflect infrastructure components, telecom assets, and their relationships
- Define data governance standards and enforce data consistency across the CMDB
- Design data integrations across systems (e.g., OSS/BSS, network monitoring tools, billing systems)

Good communication skills. Bachelor's degree.
Posted 2 weeks ago
4.0 - 8.0 years
13 - 18 Lacs
Noida
Work from Office
Position Summary
To be a technology expert architecting solutions and mentoring people in BI/Reporting processes, with prior expertise in the Pharma domain.

Job Responsibilities
- Independently drive and deliver complex reporting and BI project assignments in Power BI on AWS/Azure cloud.
- Design and deliver across Power BI services, Power Query, DAX, and data modelling concepts.
- Write complex SQL focusing on data aggregation and the analytic calculations used in reporting KPIs.
- Analyse the data and understand requirements directly from the customer or from project teams across pharma commercial data sets.
- Drive the team on day-to-day tasks in alignment with the project plan and collaborate with the team to accomplish milestones as per plan.
- Be comfortable discussing and prioritizing work items in an onshore-offshore model.
- Think analytically, using a systematic and logical approach to analyse data, problems, and situations.
- Manage client communication and client expectations independently, and deliver results back to the client as per plan.
- Excellent communication skills.

Education
BE/B.Tech, Master of Computer Application

Work Experience
- 4-8 years of experience in developing Power BI reports.
- Proficiency in Power BI services, Power Query, DAX, and data modelling concepts.
- Experience in design techniques such as UI design and creating mock-ups/intuitive visualizations for a seamless user experience.
- Expertise in writing complex SQL focusing on data aggregation and the analytic calculations used for deriving reporting KPIs.
- Strong understanding of data integration, ETL processes, and data warehousing, preferably on AWS Redshift and/or Snowflake.
- Excellent problem-solving skills, with the ability to troubleshoot and resolve technical issues.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- Good to have: experience with pharma commercial data sets and related KPIs for Sales Performance, Managed Markets, Customer 360, Patient Journey, etc.
- Good to have: experience and additional know-how of other reporting tools.

Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management

Technical Competencies: Problem Solving, Lifescience Knowledge, Communication, Capability Building / Thought Leadership, Power BI, SQL, Business Intelligence (BI), Snowflake
Posted 2 weeks ago
6.0 - 11.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Responsibilities
- Design and develop highly scalable web-based applications based on business needs.
- Design and customize software for client use with the aim of optimizing operational efficiency.
- A deep understanding of, and ability to use and explain, all aspects of application integration in .NET and data integration with SQL Server and associated technologies and standards.
- Strong background in building and operating SaaS platforms using the Microsoft technology stack with modern services-based architectures.
- Ability to recommend and configure Azure subscriptions and establish connectivity.
- Work with IT teams to set up new application architecture requirements.
- Coordinate releases with the Quality Assurance team and implement SDLC workflows and better source code integration.
- Implement the build process and continuous build integration with a unit testing framework.
- Develop and maintain a thorough understanding of business needs from both technical and business perspectives.
- Assist and mentor junior team members to enforce development guidelines.
- Take technical ownership of products and provide support with quick turnaround.
- Effectively prioritize and execute tasks in a high-pressure environment.

Qualifications / Experience
- Bachelor's/Master's degree in Computer Science / Computer Engineering.
- Minimum of 6+ years' experience in building enterprise-scale Windows and web applications using Microsoft .NET technologies.
- 5+ years of experience in C#, ASP.NET MVC, and .NET Core Web API.
- 1+ years of experience in Angular 2 or higher.
- Experience in any of the following is also desirable: Bootstrap, Knockout, Entity Framework, NHibernate, Subversion, LINQ, Asynchronous Module Definition (such as RequireJS).
- In-depth knowledge of design patterns and unit testing frameworks.
- Experience with Agile application development.
- SQL Server development, performance tuning (SQL Server 2014/2016), and troubleshooting.
- Ability to work with a sense of urgency and attention to detail.
- Excellent oral and written communication skills.
Posted 2 weeks ago
10.0 - 15.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Responsibilities
- Design and develop highly scalable web-based applications based on business needs.
- Design and customize software for client use with the aim of optimizing operational efficiency.
- A deep understanding of, and ability to use and explain, all aspects of application integration in .NET and data integration with SQL Server and associated technologies and standards.
- Design and provide architecture solutions based on business needs.
- Strong background in building and operating SaaS platforms using the Microsoft technology stack with modern services-based architectures.
- Ability to recommend and configure Azure subscriptions and establish connectivity.
- Work with IT teams to set up new application architecture requirements.
- Coordinate releases with the Quality Assurance team and implement SDLC workflows and better source code integration.
- Implement the build process and continuous build integration with a unit testing framework.
- Develop and maintain a thorough understanding of business needs from both technical and business perspectives.
- Assist and mentor junior team members to enforce development guidelines.
- Take technical ownership of products and provide support with quick turnaround.
- Effectively prioritize and execute tasks in a high-pressure environment.

Qualifications / Experience
- Bachelor's/Master's degree in Computer Science / Computer Engineering.
- Minimum of 10+ years' experience in building enterprise-scale Windows and web applications using Microsoft .NET technologies.
- 5+ years of experience in C#, ASP.NET MVC, and Microsoft Web API.
- 1+ years of experience in Angular 2 or higher.
- Experience in solution architecture in .NET technologies.
- Experience in any of the following is also desirable: Bootstrap, Knockout, Entity Framework, NHibernate, Subversion, LINQ, Asynchronous Module Definition (such as RequireJS).
- In-depth knowledge of design patterns and unit testing frameworks.
- Experience with Agile application development.
- SQL Server performance tuning (SQL Server 2014/2016) and troubleshooting.
- Ability to work with a sense of urgency and attention to detail.
- Excellent oral and written communication skills.
Posted 2 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages
   a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
   b. Develop record management processes and policies
   c. Build and maintain relationships at all levels within the client base and understand their requirements
   d. Provide sales data, proposals, data insights, and account reviews to the client base
   e. Identify areas to increase efficiency and automation of processes
   f. Set up and maintain automated data processes
   g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
   h. Produce and track key performance indicators
2. Analyze the data sets and provide adequate information
   a. Liaise with internal and external clients to fully understand data content
   b. Design and carry out surveys and analyze survey data as per the customer requirement
   c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
   d. Create data dashboards, graphs, and visualizations to showcase business performance and also provide sector and competitor benchmarking
   e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
   f. Develop predictive models and share insights with the clients as per their requirements

Deliver
1. Performance Parameter: Analyzes data sets and provides relevant information to the client
   Measure: No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy

Mandatory Skills: Business Analyst / Data Analyst (Media). Experience: 3-5 Years.
Posted 2 weeks ago
3.0 - 5.0 years
5 - 10 Lacs
Chennai
Work from Office
Job Summary:
We are seeking a SAS Data Integration Developer to design, develop, and maintain Campaign Management Data Mart (CMDM) solutions, integrate multiple data sources, and ensure data quality for marketing analytics and campaign execution.

Key Responsibilities:

Data Integration & ETL Development
- Develop data ingestion, transformation, and deduplication pipelines.
- Standardize, cleanse, and validate large-scale customer data.
- Work with GaussDB, SAS ESP, APIs, and SAS DI Studio for data processing.

Master Data Management (CMDM) Configuration
- Implement unification and deduplication logic for a single customer view.
- Develop and manage data masking and encryption for security compliance.

API & CI360 Integration
- Integrate CMDM with SAS CI360 for seamless campaign execution.
- Ensure API connectivity and data flow across platforms.

Testing & Deployment
- Conduct unit, integration, and UAT testing.
- Deploy CMDM solutions to production and provide knowledge transfer.

Key Skills Required:
- SAS Data Integration Studio (SAS DI Studio): design, develop, and maintain the Campaign Management Data Mart (CMDM)
- Data management (SAS Base, SQL, data cleansing)
- SAS ESP, GaussDB, and API integration
- Data governance (RBAC, GDPR, PII compliance)
- Data masking and encryption techniques
Posted 2 weeks ago
3.0 - 7.0 years
3 - 5 Lacs
Bengaluru
Hybrid
Role & responsibilities
The MDM Analyst / Data Steward works closely with business stakeholders to understand and gather data requirements, develop data models and database designs, and define and implement data standards, policies, and procedures. This role also implements rules inside the MDM tool to improve the data, performs deduplication projects to develop golden records, and works towards improving the overall quality of data in the assigned domain.

Required skills:
- Technical Skills: Proficiency in MDM tools and technologies such as Informatica MDM, CluedIn, or similar platforms is essential. Familiarity with data modeling, data integration, and data quality control techniques is also important. Experience with data governance platforms like Collibra and Alation can be beneficial.
- Analytical Skills: Strong analytical and problem-solving skills are crucial for interpreting and working with large volumes of data. The ability to translate complex business requirements into practical MDM solutions is also necessary.
- Data Management: Experience in designing, implementing, and maintaining master data management systems and solutions, including conducting data cleansing, data auditing, and data validation activities.
- Communication and Collaboration: Excellent communication and interpersonal skills to effectively collaborate with business stakeholders, IT teams, and other departments.
- Data Governance: In-depth knowledge of data governance, data quality, and data integration principles. The ability to develop and implement data management processes and policies is essential.
- Educational Background: A Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field is typically required.
- Certifications: Certification in the MDM domain (e.g., Certified MDM Professional) can be a plus.

Key Skills:
- Become the expert in the assigned domain of data
- Understand all source systems feeding into the MDM
- Write stewardship documentation for the domain
- Develop rules and standards for the domain of data
- Generate measures of improvement to demonstrate data quality to the business

We are seeking candidates who can join immediately or within a maximum of 30 days' notice. A minimum of 3+ years of relevant experience is required. Candidates should be willing to relocate to Bangalore or already be based in Bangalore, and should be flexible with working UK/US shifts.
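The deduplication-into-golden-records work described above usually means matching records on a normalized key and then merging survivors by a precedence rule. A toy sketch under assumed field names (not tied to Informatica MDM or CluedIn, whose match/merge engines do this declaratively):

```python
from collections import defaultdict

def normalize_key(record):
    """Match records on lower-cased, whitespace-stripped name + email."""
    return (record["name"].strip().lower(), record["email"].strip().lower())

def golden_records(records):
    """Merge duplicates into one golden record per key. Survivorship rule:
    for each field, keep the first non-empty value, scanning records from
    most recently updated downward."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize_key(rec)].append(rec)
    merged = []
    for recs in groups.values():
        recs.sort(key=lambda r: r["updated"], reverse=True)
        golden = {}
        for rec in recs:
            for field, value in rec.items():
                if not golden.get(field) and value:
                    golden[field] = value
        merged.append(golden)
    return merged

# Hypothetical duplicate customer rows for illustration.
dupes = [
    {"name": "Ada Lovelace", "email": "ada@example.com",
     "phone": "", "updated": "2024-01-01"},
    {"name": "ada lovelace ", "email": "ADA@example.com",
     "phone": "555-0100", "updated": "2023-06-01"},
]
golden = golden_records(dupes)
```

Real MDM tools add fuzzy matching, trust scores per source system, and cross-reference tracking, but the key/survivorship split shown here is the core of the pattern.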
Posted 2 weeks ago
1.0 - 6.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Role Description:
As part of the cybersecurity organization, in this vital role you will be responsible for designing, building, and maintaining data infrastructure to support data-driven decision-making. This role involves working with large datasets, developing reports, executing data governance initiatives, and ensuring data is accessible, reliable, and efficiently managed. The role sits at the intersection of data infrastructure and business insight delivery, requiring the Data Engineer to design and build robust data pipelines while also translating data into meaningful visualizations for stakeholders across the organization. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture, ETL processes, and cybersecurity data frameworks.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Build data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Develop and maintain interactive dashboards and reports using tools like Tableau, ensuring data accuracy and usability.
- Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures.
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with data scientists to develop pipelines that meet dynamic business needs.
- Share and discuss findings with team members practicing the SAFe Agile delivery model.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience, OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience, OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Preferred Qualifications:
- Hands-on experience with data practices, technologies, and platforms such as Databricks, Python, GitLab, LucidChart, etc.
- Hands-on experience with data visualization and dashboarding tools (Tableau, Power BI, or similar) is a plus
- Proficiency in data analysis tools (e.g., SQL) and experience with data sourcing tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Understanding of data governance frameworks, tools, and best practices
- Knowledge of and experience with data standards (FAIR) and protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
- Experience with ETL tools and various Python packages related to data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, and cloud data platforms
- Experience working in a product team environment
- Experience working in an Agile environment

Professional Certifications:
- AWS Certified Data Engineer preferred
- Databricks certification preferred

Soft Skills:
- Initiative to explore alternate technologies and approaches to solving problems
- Skilled in breaking down problems, documenting problem statements, and estimating efforts
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to handle multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
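A pipeline step of the kind this role describes (extract, validate for quality, transform, load) can be sketched as plain functions; the schema, the "claims_feed" tag, and the required-field rule below are assumptions for illustration, not any specific system's contract:

```python
def validate(rows, required=("id", "value")):
    """Drop rows missing required fields. Return (clean, rejected) so
    failures can be counted and monitored rather than silently lost."""
    clean, rejected = [], []
    for row in rows:
        if all(row.get(f) is not None for f in required):
            clean.append(row)
        else:
            rejected.append(row)
    return clean, rejected

def transform(rows):
    """Example transformation: normalize value to float and tag each
    row with its (hypothetical) source system."""
    return [{**row, "value": float(row["value"]), "source": "claims_feed"}
            for row in rows]

# Hypothetical extracted batch with one bad row.
raw = [{"id": 1, "value": "10.5"},
       {"id": 2, "value": None},
       {"id": 3, "value": "7"}]
clean, rejected = validate(raw)
loaded = transform(clean)
```

Keeping the rejected rows as a first-class output is what makes "ensure data quality" and "monitored for failures" from the responsibilities list actionable: the rejection count becomes a pipeline metric.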
Posted 2 weeks ago
3.0 - 6.0 years
4 - 7 Lacs
Hyderabad
Work from Office
The Data Steward will play a critical role in ensuring data integrity, quality, and governance within SAP systems. The responsibilities include:

Data Governance:
- Define ownership and accountability for critical data assets to ensure they are effectively managed and maintain integrity throughout systems.
- Collaborate with business and IT teams to enforce data governance policies, ensuring alignment with enterprise data standards.

Data Quality Management:
- Promote data accuracy and adherence to defined data management and governance practices.
- Identify and resolve data discrepancies to enhance operational efficiency.

Data Integration and Maintenance:
- Manage and maintain master data quality for the Finance and Material domains within the SAP system.
- Support SAP data migrations, validations, and audits to ensure seamless data integration.

Compliance and Reporting:
- Ensure compliance with regulatory and company data standards.
- Develop and distribute recommendations and supporting documentation for new or proposed data standards, business rules, and policies.
Posted 2 weeks ago
8.0 - 13.0 years
30 - 40 Lacs
Hyderabad
Work from Office
Looking for someone to work on an ARCS implementation-plus-support project for an elite customer in the digital sports platform industry, who will configure, set up, and implement the end-to-end module and provide post-implementation support.

Requirements (Oracle ARCS):
- Experience with at least 1 end-to-end ARCS implementation
- Proficient in setting up ARCS formats and rules
- Knowledge of the month-end close and reconciliation process
- Proficient in Excel
- Knowledge of ARCS Transaction Matching
- Ability to build automation using EPM Automate
- Knowledge of SQL and building ARCS custom reports
- Hands-on experience with the functional and operational aspects of ARCS application design and development of various application artifacts
- Proficient in Data Exchange / FDMEE / Data Integration & EPMi
Posted 2 weeks ago
8.0 - 13.0 years
30 - 35 Lacs
Hyderabad
Work from Office
What you will do
In this vital role you will develop an insight-driven sensing capability focused on revolutionizing decision making. You will lead the technical delivery for this capability as part of a team of data engineers and software engineers. The team will rely on your leadership to own and refine the vision, prioritize features, align partners, and lead solution delivery while building this ground-breaking new capability for Amgen. You will drive the software engineering side of each product release and deliver on its outcomes.
Roles & Responsibilities:
Lead delivery of the overall product and product features from concept to end of life; manage a product team comprising technical engineers, product owners, and data scientists to ensure that business, quality, and functional goals are met with each product release
Drive excellence and quality for product releases, collaborating with partner teams; impact the quality, efficiency, and effectiveness of your own team; provide significant input into priorities
Incorporate and prioritize feature requests into the product roadmap, and translate the roadmap into execution
Design and implement the usability, quality, and delivery of a product or feature
Plan releases and upgrades with no impact to the business
Bring hands-on expertise in driving quality and best-in-class Agile engineering practices
Encourage and motivate the product team to deliver innovative and exciting solutions with an appropriate sense of urgency
Manage the progress of work and address production issues during sprints
Communicate with partners to make sure goals are clear and the vision is aligned with business objectives
Provide direct management and staff development for team members
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Master's degree and 8 to 10 years of Information Systems experience OR Bachelor's degree and 10 to 14 years of Information Systems experience OR Diploma and 14 to 18 years of Information Systems experience
Thorough understanding of modern web application development and delivery, Gen AI application development, data integration, and enterprise data fabric concepts, methodologies, and technologies (e.g. AWS technologies, Databricks)
Demonstrated experience in building strong teams with consistent practices
Demonstrated experience in navigating a matrix organization and leading change
Prior experience writing business case documents and securing funding for product team delivery; financial/spend management for small to medium product teams is a plus
In-depth knowledge of Agile process and principles
Define success metrics for developer productivity; on a monthly/quarterly basis, analyze how the product team is performing against established KPIs
Functional Skills:
Leadership:
Influences through Collaboration: Builds direct and behind-the-scenes support for ideas by working collaboratively with others.
Strategic Thinking: Anticipates downstream consequences and tailors influencing strategies to achieve positive outcomes.
Transparent Decision-Making: Clearly articulates the rationale behind decisions and their potential implications, continuously reflecting on successes and failures to enhance performance and decision-making.
Adaptive Leadership: Recognizes the need for change and actively participates in technical strategy planning.
Preferred Qualifications:
Strong influencing skills; able to influence stakeholders and balance priorities
Prior experience in vendor management
Prior hands-on experience leading full stack development using cloud infrastructure services (AWS preferred) and cloud-native tools and design patterns (containers, serverless, Docker, etc.)
Experience developing solutions on AWS technologies such as S3, EMR, Spark, Athena, Redshift, and others
Familiarity with cloud security (AWS/Azure/GCP)
Conceptual understanding of DevOps tools (Ansible/Chef/Puppet/Docker/Jenkins)
Professional Certifications:
AWS Certified Solutions Architect (preferred)
Certified DevOps Engineer (preferred)
Certified Agile Leader or similar (preferred)
Soft Skills:
Strong desire for continuous learning and picking up new tools/technologies
High attention to detail with critical-thinking ability
Should be an active contributor in technology communities/forums
Proactively engages with cross-functional teams to resolve issues and design solutions using critical thinking, analysis skills, and best practices
Influences and energizes others toward the common vision and goal; maintains momentum and finds new directions toward the goal even when setbacks render one path impassable
Established habit of proactive thinking and behavior, and the desire and ability to self-start, learn, and apply new technologies
Excellent organizational and time-management skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation; ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required by business needs.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Hyderabad
Work from Office
Role Description: We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
Ensure data security, compliance, and role-based access control (RBAC) across data environments.
Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.
Must-Have Skills:
Hands-on experience with data engineering technologies such as Databricks, Apache Spark (PySpark, SparkSQL), AWS, Python, SQL, and Scaled Agile methodologies.
Proficiency in workflow orchestration and performance tuning for big data processing.
Strong understanding of AWS services.
Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
Ability to quickly learn, adapt, and apply new technologies.
Strong problem-solving and analytical skills.
Excellent communication and teamwork skills.
Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills:
Deep expertise in the biotech and pharma industries.
Experience writing APIs to make data available to consumers.
Experience with SQL/NoSQL databases and vector databases for large language models.
Experience with data modeling and performance tuning for both OLAP and OLTP databases.
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Education and Professional Certifications:
Master's degree and 3 to 4+ years of Computer Science, IT, or related field experience OR Bachelor's degree and 5 to 8+ years of Computer Science, IT, or related field experience.
AWS Certified Data Engineer preferred. Databricks certification preferred. Scaled Agile SAFe certification preferred.
Soft Skills:
Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly and to be organized and detail oriented. Strong presentation and public speaking skills.
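A metadata-driven pipeline of the kind described above can be sketched in miniature: the transformation is driven by a mapping configuration rather than hard-coded per feed, so onboarding a new source is a configuration change. Plain Python for illustration; a production version would typically run the same idea on Spark/Databricks.

```python
def run_mapping(rows, mapping):
    """Apply a metadata-driven mapping to a batch of source rows.

    `mapping` maps each target column to a source column plus an optional
    list of transform callables, applied in order.
    """
    out = []
    for row in rows:
        rec = {}
        for target, spec in mapping.items():
            value = row.get(spec["source"])
            for fn in spec.get("transforms", []):
                value = fn(value)  # e.g. trim, case-normalize, cast
            rec[target] = value
        out.append(rec)
    return out
```

Because the mapping is data, it can live in a metadata store and feed lineage tracking: every target column records exactly which source column and transforms produced it.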
Posted 2 weeks ago
3.0 - 5.0 years
5 - 10 Lacs
Chennai
Work from Office
Job Summary: We are seeking a SAS Data Integration Developer to design, develop, and maintain Campaign Management Data Mart (CMDM) solutions, integrate multiple data sources, and ensure data quality for marketing analytics and campaign execution.
Key Responsibilities:
Data Integration & ETL Development
o Develop data ingestion, transformation, and deduplication pipelines.
o Standardize, cleanse, and validate large-scale customer data.
o Work with GaussDB, SAS ESP, APIs, and SAS DI Studio for data processing.
Master Data Management (CMDM) Configuration
o Implement unification and deduplication logic for a single customer view.
o Develop and manage data masking and encryption for security compliance.
API & CI360 Integration
o Integrate CMDM with SAS CI360 for seamless campaign execution.
o Ensure API connectivity and data flow across platforms.
Testing & Deployment
o Conduct unit, integration, and UAT testing.
o Deploy CMDM solutions to production and provide knowledge transfer.
Key Skills Required:
SAS Data Integration Studio (SAS DI Studio): design, develop, and maintain the Campaign Management Data Mart (CMDM)
Data management (SAS Base, SQL, data cleansing)
SAS ESP, GaussDB, and API integration
Data governance (RBAC, GDPR, PII compliance)
Data masking and encryption techniques
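The unification and deduplication logic behind a single customer view can be sketched as follows. This is illustrative Python, not SAS DI Studio; a real CMDM match would use richer survivorship rules, fuzzy matching, and multiple match keys, and the `email` key here is a simplifying assumption.

```python
import re

def unify_customers(records):
    """Collapse customer records into a single view, keyed on a normalized
    email (whitespace removed, lowercased); later records fill gaps in the
    golden record rather than overwriting populated fields."""
    golden = {}
    for rec in records:
        key = re.sub(r"\s+", "", rec.get("email", "")).lower()
        if not key:
            continue  # a real pipeline would route keyless records to review
        master = golden.setdefault(key, {})
        for field, value in rec.items():
            # First-non-empty-wins survivorship: keep existing values.
            if value and not master.get(field):
                master[field] = value
    return list(golden.values())
```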
Posted 2 weeks ago
4.0 - 6.0 years
14 - 24 Lacs
Hyderabad
Hybrid
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
Design, develop, test, and maintain scalable ETL data pipelines using Python.
Work extensively on Google Cloud Platform (GCP) services such as:
Dataflow for real-time and batch data processing
Cloud Functions for lightweight serverless compute
BigQuery for data warehousing and analytics
Cloud Composer for orchestration of data workflows (based on Apache Airflow)
Google Cloud Storage (GCS) for managing data at scale
IAM for access control and security
Cloud Run for containerized applications
Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
Implement and enforce data quality checks, validation rules, and monitoring.
Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
Document pipeline designs, data flow diagrams, and operational support procedures.
Required Skills:
4-6 years of hands-on experience in Python for backend or data engineering projects.
Strong understanding of and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
Solid understanding of data pipeline architecture, data integration, and transformation techniques.
Experience with version control systems such as GitHub and knowledge of CI/CD practices.
Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Good to Have (Optional Skills):
Experience working with the Snowflake cloud data platform.
Hands-on knowledge of Databricks for big data processing and analytics.
Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
Additional Details:
Excellent problem-solving and analytical skills.
Strong communication skills and ability to collaborate in a team environment.
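Cloud Composer, mentioned in the responsibilities above, orchestrates tasks as a dependency-ordered DAG; the core idea can be sketched with the standard library. Illustrative Python only (assumes Python 3.9+ for `graphlib`); Composer/Airflow adds scheduling, retries, and operators on top of this basic topological execution.

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run callables in dependency order, the essence of a DAG run.

    `deps` maps each task name to the set of upstream task names;
    each task receives the dict of results produced so far.
    """
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name](results)
    return results
```

A classic extract -> transform -> load chain then becomes three entries in `tasks` with `deps` wiring them together, and adding a branch is just another edge in `deps`.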
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thereby informing business decisions.
Do:
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze the data sets and provide adequate information
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirements
Deliver:
No. | Performance Parameter | Measure
1 | Analyzes data sets and provides relevant information to the client | No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy
Mandatory Skills: Database Architecting.
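KPI production of the kind listed under "Deliver" reduces to small aggregations over operational records. A minimal Python sketch; the record fields (`due`, `delivered`, `errors`) are hypothetical and stand in for whatever the client's delivery data actually carries.

```python
def kpi_summary(deliveries):
    """Compute two of the delivery KPIs above: on-time rate and
    error-free (accuracy) rate over a list of delivery records."""
    total = len(deliveries)
    if total == 0:
        return {"on_time_rate": None, "accuracy_rate": None}
    on_time = sum(1 for d in deliveries if d["delivered"] <= d["due"])
    accurate = sum(1 for d in deliveries if not d.get("errors"))
    return {"on_time_rate": round(on_time / total, 3),
            "accuracy_rate": round(accurate / total, 3)}
```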
Posted 2 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Chennai
Work from Office
About Company: Agilysys is well known for its long heritage of hospitality-focused technology innovation. The Company delivers modular and integrated software solutions and expertise to businesses seeking to maximize Return on Experience (ROE) through hospitality encounters that are both personal and profitable. Over time, customers achieve High Return Hospitality by consistently delighting guests, retaining staff, and growing margins. Customers around the world include branded and independent hotels; multi-amenity resort properties; casinos; property, hotel and resort management companies; cruise lines; corporate dining providers; higher education campus dining providers; food service management companies; hospitals; lifestyle communities; senior living facilities; stadiums; and theme parks. The Agilysys Hospitality Cloud™ combines core operational systems for property management (PMS), point-of-sale (POS) and Inventory and Procurement (I&P) with Experience Enhancers™ that meaningfully improve interactions for guests and for employees across dimensions such as digital access, mobile convenience, self-service control, personal choice, payment options, service coverage and real-time insights to improve decisions. Core solutions and Experience Enhancers are selectively combined in Hospitality Solution Studios™ tailored to specific hospitality settings and business needs. Agilysys operates across the Americas, Europe, the Middle East, Africa, Asia-Pacific, and India with headquarters located in Alpharetta, GA. For more information, visit Agilysys.com.
Requirements & Responsibilities:
Proficiency in MongoDB data modeling
Strong experience with MongoDB query and index tuning
Experience with MongoDB sharding and replication
Troubleshooting MongoDB bottlenecks
State-of-the-art MongoDB performance tuning capabilities
Respond to incidents and bring them to closure
Ensure that the databases achieve maximum performance and availability
Recommend and implement best practices
Passion for troubleshooting the toughest problems and proposing creative solutions
Desired Experience: Hospitality experience.
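Compound-index tuning in MongoDB commonly follows the Equality-Sort-Range (ESR) guideline: equality-matched fields first, then sort keys, then range-filtered fields. A deliberately simplified Python sketch of that rule of thumb (it ignores covered queries, multikey limits, and field selectivity, all of which matter in real tuning):

```python
def propose_compound_index(equality, sort, ranges):
    """Order compound-index keys by the Equality-Sort-Range guideline.

    `equality`: fields matched with exact values; `sort`: (field, direction)
    pairs from the query's sort; `ranges`: fields filtered with $gt/$lt etc.
    Returns an ordered key list suitable for createIndex-style input.
    """
    keys = []
    for field in equality:
        keys.append((field, 1))          # equality keys first
    for field, direction in sort:
        keys.append((field, direction))  # then sort keys, preserving direction
    for field in ranges:
        keys.append((field, 1))          # range keys last
    return keys
```

For a query like `{status: "open", amount: {$gt: 100}}` sorted by `created_at` descending, the sketch suggests `status, created_at, amount`, which is the ordering the ESR guideline recommends.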
Posted 2 weeks ago
5.0 - 10.0 years
3 - 8 Lacs
Navi Mumbai
Work from Office
Title: Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Senior SAS Programmer to join our Biostatistics team in Mumbai, India. This position will work on a team to accomplish tasks and projects that are instrumental to the company's success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you.
Overview: Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical, and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas, including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral, and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.
Responsibilities:
Propose and develop specifications for new projects and serve as a project team leader
Write SAS programs for use in creating analysis datasets, tables, listings, and figures
Using SAS, program, validate, and maintain mapped databases
Program edit checks for external data
Responsible for the setup, validation, and maintenance of mapped databases, the integration of external data with associated edit checks, and independently writing high-quality programs for use in creating analysis datasets, tables, listings, and figures.
Responsible for mapped database setup, validation, and maintenance, and for external data integration and edit checks
Qualifications:
Bachelor's/Master's degree in Math, Statistics, Health Informatics, Data Science, Computer Science, or a life sciences field
5+ years' experience with SAS
Excellent knowledge of CDISC standards
SAS certification
Thorough understanding of the pharmaceutical industry and federal regulations regarding electronic records
Excellent analytical, written, and oral communication skills
Good written English/communication skills are required
People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today.
The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.
Medpace Perks:
Flexible work environment
Competitive compensation and benefits package
Competitive PTO packages
Structured career paths with opportunities for professional growth
Company-sponsored employee appreciation events
Employee health and wellness initiatives
Awards:
Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024
Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility
What to Expect Next: A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.
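The edit checks mentioned in the responsibilities boil down to rule-based queries against incoming external data. This is an illustrative Python sketch rather than SAS, and the fields and rules (`normal_range`, future visit dates) are hypothetical examples of the kind of checks run on clinical feeds.

```python
from datetime import date

def edit_checks(rows):
    """Run simple edit checks on external data rows: flag lab values
    outside the supplied normal range and visit dates in the future.

    Returns a list of (subject_id, problem) query tuples.
    """
    queries = []
    for row in rows:
        low, high = row["normal_range"]
        if not (low <= row["value"] <= high):
            queries.append((row["subject"], "value out of normal range"))
        if row["visit_date"] > date.today():
            queries.append((row["subject"], "visit date in the future"))
    return queries
```

Each flagged tuple would normally become a data query routed back to the site or vendor for resolution.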
Posted 2 weeks ago
1.0 - 3.0 years
4 - 8 Lacs
Hyderabad
Work from Office
What you will do: In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing.
Be a key team member who assists in the design and development of the data pipeline.
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
Implement data security and privacy measures to protect sensitive data.
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
Collaborate and communicate effectively with product teams.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications and Experience:
Master's degree and 1 to 3 years of experience in Computer Science, IT, or a related field OR Bachelor's degree and 3 to 5 years of experience in Computer Science, IT, or a related field OR Diploma and 7 to 9 years of experience in Computer Science, IT, or a related field
Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), plus workflow orchestration and performance tuning for big data processing.
Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools.
Excellent problem-solving skills and the ability to work with large, complex datasets.
Preferred Qualifications:
Good-to-Have Skills: Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development. Strong understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms.
Professional Certifications:
Certified Data Engineer / Data Analyst (preferred, on Databricks or cloud environments)
Certified Data Scientist (preferred, on Databricks or cloud environments)
Machine Learning Certification (preferred, on Databricks or cloud environments)
Soft Skills:
Excellent critical-thinking and problem-solving skills.
Strong communication and collaboration skills.
Demonstrated awareness of how to function in a team setting.
Demonstrated presentation skills.
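A data dictionary, one of the documentation artifacts mentioned in the responsibilities, can be bootstrapped from sample records. A minimal Python sketch under the assumption that rows arrive as dicts; a real dictionary would add business definitions, constraints, and owners on top of this profiling.

```python
def build_data_dictionary(rows):
    """Derive a minimal data dictionary from sample records: for each
    column, the set of observed Python type names and a null count."""
    dictionary = {}
    for row in rows:
        for col, value in row.items():
            entry = dictionary.setdefault(col, {"types": set(), "nulls": 0})
            if value is None:
                entry["nulls"] += 1
            else:
                entry["types"].add(type(value).__name__)
    return dictionary
```

Running this over a new feed gives a first-pass schema profile that reviewers can correct, rather than documenting every column from scratch.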
Posted 2 weeks ago
9.0 - 14.0 years
20 - 35 Lacs
Hyderabad, Pune
Work from Office
Our client is a leading global IT services and consulting organization.
Job Description:
Shift Timings: 12:00 noon to 9:30 PM IST
Location: Hyderabad and Pune only
Role: Architect
Key skills required for the job: Talend DI (mandatory) and good exposure to RDBMS databases such as Oracle and SQL Server.
3+ years of experience implementing ETL projects in a large-scale enterprise data warehouse environment; at least one successful Talend-with-DWH implementation is a must.
As a senior developer, the candidate is responsible for the development, support, maintenance, and implementation of a complex project module. The candidate is expected to have deep knowledge of the specified technological area, including applicable processes, methodologies, standards, products, and frameworks.
She/he should have experience applying standard software development principles using Talend.
She/he should be able to work as an independent team member, capable of applying judgment to plan and execute HWB tasks.
Build reusable Talend jobs, routines, and components to support data integration, quality, and transformations.
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Role Description: Amgen is seeking a Sr Associate HR Data Analysis (Visier Admin). The Sr Associate HR Data Analysis (Visier Admin) will report to the Associate Director, HR Technology. The successful incumbent will have previous admin experience with the Visier reporting tool.
Roles & Responsibilities:
Hands-on experience supporting Visier
Previous experience with Vee
Administrative tasks associated with Visier, such as role assignments and creating roles
Visier security configuration, data integration, and data exports
Ability to analyze, troubleshoot, and resolve Visier data issues
Must have previous experience handling large datasets and sensitive HR data
Basic Qualifications and Experience:
Minimum 5 years of experience in human resources with hands-on Visier experience
Master's degree, OR Bachelor's degree and 5 years of HRIS experience
Functional Skills (Must-Have):
Strong working knowledge of Visier
5+ years of experience in human resources and a corporate service center supporting Workday
Soft Skills:
Excellent analytical and troubleshooting skills
Strong quantitative, analytical (technical and business), and problem-solving skills, with attention to detail
Strong verbal and written communication and presentation skills
Ability to work effectively with global, virtual teams
Strong technical acumen, logic, judgment, and decision-making
Strong initiative and desire to learn and grow
Ability to manage multiple priorities successfully
Exemplary adherence to ethics, data privacy, and compliance policies
Posted 2 weeks ago