2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As a PySpark Data Engineer, you must have a minimum of 2 years of experience in PySpark. Strong programming skills in Python, PySpark, and Scala are preferred. Experience in designing and implementing CI/CD, Build Management, and Development strategies is essential. Familiarity with SQL and SQL analytical functions is required, along with participation in key business, architectural, and technical decisions. There is an opportunity for training in AWS cloud technology.

In the role of a Python Developer, a minimum of 2 years of experience in Python/PySpark is necessary. Strong programming skills in Python, PySpark, and Scala are preferred. Experience in designing and implementing CI/CD, Build Management, and Development strategies is essential. Familiarity with SQL and SQL analytical functions and participation in key business, architectural, and technical decisions are also required. There is potential for training in AWS cloud technology.

As a Senior Software Engineer at Capgemini, you should have over 3 years of experience in Scala with a strong project track record. Hands-on experience in Scala/Spark development and SQL writing skills on RDBMS (DB2) databases are crucial. Experience working with different file formats such as JSON, Parquet, AVRO, ORC, and XML is preferred. Previous involvement in an HDFS platform development project is necessary. Proficiency in data analysis, data profiling, and data lineage, along with strong oral and written communication skills, is required. Experience in Agile projects is a plus.

For the position of Data Modeler, expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling is essential. Knowledge of data warehousing concepts such as star schema, snowflake, or data vault for data marts or data warehousing is required. Proficiency in data modeling software such as Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models is necessary.
Hands-on knowledge of and experience with tools like PL/SQL, PySpark, Hive, Impala, and other scripting tools are preferred. Experience with the Software Development Lifecycle using the Agile methodology is essential, and strong communication and stakeholder management skills are crucial for this role.

In this role, you will design, develop, and optimize PL/SQL procedures, functions, triggers, and packages. You will write efficient SQL queries, joins, and subqueries for data retrieval and manipulation, and develop and maintain database objects such as tables, views, indexes, and sequences. Optimizing query performance and troubleshooting database issues to improve efficiency are key responsibilities. Collaboration with application developers, business analysts, and system architects to understand database requirements is essential, as is ensuring data integrity, consistency, and security within Oracle databases. Developing ETL processes and scripts for data migration and integration is part of the responsibilities, along with documenting database structures, stored procedures, and coding best practices. Staying up to date with Oracle database technologies, best practices, and industry trends is essential for success in this role.
Posted 3 weeks ago
3.0 - 6.0 years
18 - 30 Lacs
Mumbai
Work from Office
Hello Connections, greetings from Teamware Solutions! We are #Hiring for a top investment bank.

Position: Data Analyst
Location: Mumbai
Experience Range: 3 to 6 Years
Notice Period: Immediate to 30 Days
Must-have skills: Data analysis, data catalog, data analytics, and data governance, along with Collibra.

What You'll Do: As part of the Data & Analytics Group (DAG) and reporting to the Head of the India DAG locally, the individual is responsible for the following:
1. Review, analyze, and resolve data quality issues across IM Data Architecture.
2. Coordinate with data owners and other teams to identify the root cause of data quality issues and implement solutions.
3. Coordinate the onboarding of data from various internal/external sources into the central repository.
4. Work closely with Data Owners/Owner delegates on data analysis and development of data quality (DQ) rules. Work with IT on enhancing DQ controls.
5. Perform end-to-end analysis of business processes, data flows, and data usage to improve business productivity through re-engineering and data governance.
6. Manage the change control process and participate in user acceptance testing (UAT) activities.

What We're Looking For:
1. Minimum 3-6 years' experience in data analysis, data catalog, and Collibra.
2. Experience in data analysis and profiling using SQL is a must.
3. Knowledge of coding; Python is a plus.
4. Experience working with cataloging tools like Collibra.
5. Experience working with BI reporting tools like Tableau and Power BI is preferred.

Preferred Qualifications:
1. Bachelor's Degree required; any other relevant academic course a plus.
2. Fluent in English.

Apply now: francy.s@twsol.com
Posted 3 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
Hyderabad
Work from Office
As a Senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and its associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include: Strategic SAP Solution Focus: working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs. Comprehensive Solution Delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Overall, 5-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ and 3+ years of SAP functional experience specializing in the design and configuration of SAP BODS/HANA SDI modules. Experience in gathering business requirements; should be able to create requirement specifications based on architecture/design/detailing of processes, and prepare mapping sheets combining functional and technical expertise. All BODS consultants should primarily have data migration experience from different legacy systems to SAP or non-SAP systems, including data migration from SAP ECC to S/4HANA using the Migration Cockpit or other methods.
In addition to data migration experience, the consultant should have experience with or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance.

Preferred technical and professional experience: BODS admin experience/knowledge. Working or strong knowledge of SAP Data Hub. Experience with or strong knowledge of HANA SDI (Smart Data Integration) to use as an ETL tool, including the ability to develop flowgraphs to validate and transform data. The consultant should develop workflows and data flows based on the specifications using various stages in BODS.
Posted 3 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational qualification: Bachelor of Engineering, BCA, BTech, MTech, MCA, MBA. Service Line: Application Development and Maintenance.

Responsibilities: A day in the life of an Infoscion. As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and professional skills: Healthcare Data Analyst, PL/SQL, SQL, data mapping, STTM creation, data profiling, reports. Preferred skills: Domain: Healthcare - ALL.
Posted 3 weeks ago
5.0 - 10.0 years
5 - 15 Lacs
Hyderabad
Work from Office
We are looking for a highly skilled Senior Database Developer who can work independently under limited supervision and apply their expertise in database design, development, and maintenance. This role requires a strong background in SQL, relational databases, and data modelling, with a focus on optimizing performance and supporting business intelligence capabilities.

Responsibilities: Provide strategic direction and guidance for enterprise data architecture that supports business intelligence capabilities. Design and develop conceptual, logical, and physical data models, ensuring optimal performance, scalability, and maintainability. Use profiling tools to identify slow or resource-intensive SQL queries and develop solutions to improve performance. Focus on performance tuning, especially for complex queries, stored procedures, and indexing strategies. Design and implement new features with a focus on scalability and maintainability. Document and define data modeling requirements, ensuring that the application's database design aligns with technical and functional specifications. Ensure significant database design decisions are communicated and validated, adhering to best practices. Ensure the long-term reliability, scalability, and maintainability of systems. Collaborate with cross-functional teams to gather requirements and implement solutions. Assist in the adoption and application of industry best practices and guidelines.

Qualifications: Educational background: Bachelor's degree or higher in Information Systems, Computer Science, or a related field (or equivalent experience). Experience: 5+ years as a SQL Server Database Developer or Database Administrator (DBA). Technical skills: Strong expertise in SQL and experience writing complex SQL queries. Hands-on experience with SQL-XML programming. Extensive experience with Microsoft SQL Server and database architectures.
Familiarity with performance tuning of SQL queries, stored procedures, and indexing strategies. Knowledge of Data Profiling Tools and performance optimization (CPU/memory/I/O concerns). Experience with Data Modelling and Database Design. Knowledge of ETL tools like Pentaho is a plus. Programming skills in Java and a willingness to explore new languages or transition into a Full-stack Engineer role. Experience with Agile methodologies (preferably SCRUM) and quick delivery through release management. Soft Skills: Strong attention to detail and results-oriented approach. Passionate, intelligent, and a critical thinker with excellent problem-solving skills. Ability to thrive in a fast-paced environment with multiple ongoing projects. Excellent written and verbal communication skills. Collaborative mindset, with the ability to work with all levels of management and stakeholders. Desired Traits: Self-motivated, technical, results-oriented, and quality-focused individual. Strong data warehouse and architecture skills. Excellent problem-solving abilities, proactive with a focus on delivering business value. A team player who is detail-oriented, respectful, and thoughtful.
Posted 3 weeks ago
8.0 - 10.0 years
30 - 32 Lacs
Hyderabad, Ahmedabad, Chennai
Work from Office
Dear Candidate, we are looking for a skilled Data Engineer to design and maintain data pipelines, ensuring efficient data processing and storage. If you have expertise in ETL, SQL, and cloud-based data platforms, we'd love to hear from you!

Key Responsibilities: Design, develop, and maintain scalable data pipelines. Optimize data workflows for performance and efficiency. Work with structured and unstructured data sources. Implement data governance and security best practices. Collaborate with data scientists and analysts to support data-driven decisions. Ensure compliance with data privacy regulations (GDPR, CCPA).

Required Skills & Qualifications: Proficiency in SQL and Python or Scala for data processing. Experience with ETL tools (Informatica, Apache NiFi, AWS Glue). Hands-on experience with cloud data platforms (AWS, Azure, GCP). Knowledge of data warehousing (Snowflake, Redshift, BigQuery). Familiarity with Apache Spark, Kafka, or Hadoop for big data processing.

Soft Skills: Strong problem-solving and analytical skills. Ability to work independently and in a team. Good communication skills to collaborate with stakeholders.

Note: If interested, please share your updated resume and your preferred contact details. If shortlisted, our HR team will reach out to you. Kandi Srinivasa Reddy, Delivery Manager, Integra Technologies
Posted 3 weeks ago
7.0 - 12.0 years
22 - 27 Lacs
Pune
Hybrid
Job Title: AVP Data Designer
Location: Pune
Package: up to 27 LPA

Key Responsibilities: Translate business data needs into scalable data models, schemas, and flows. Lead the design and implementation of logical and physical data models across platforms. Conduct data profiling and quality analysis to ensure data integrity. Collaborate with cross-functional teams to define data requirements and ensure smooth integration with existing systems. Maintain and update metadata, data dictionaries, and design specifications. Support the bank's Data & Analytics strategy by enabling use-case-driven data solutions. Ensure data solutions comply with governance, risk, and security frameworks. Optimize data structures for performance, scalability, and business insight generation.

Must-Have Skills: 5-8 years of experience in data design, data modeling, or data architecture. Proficiency in SQL and working with databases like Oracle, MySQL, and SQL Server. Hands-on experience with Kafka, AWS, or other cloud/data streaming platforms. Strong understanding of data profiling, quality checks, and remediation. Excellent communication skills, with the ability to work with both technical and non-technical teams.

Nice-to-Have: Bachelor's degree in Data Science, Computer Science, or a related field. Knowledge of data warehousing and ETL concepts. Experience in the financial services or financial crime domain. Familiarity with data governance tools and frameworks. Exposure to tools like Power BI, Tableau, or data catalog platforms.

For more details, call Kanika on 9953939776 or email your resume to kanika@manningconsulting.in
Posted 3 weeks ago
0.0 - 3.0 years
2 - 6 Lacs
Pune
Work from Office
About The Role: eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality. The role also involves creating technical specifications and product descriptions for online presentation, as well as working on consultancy projects to redesign e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area is a plus.

Apprentice Analyst roles and responsibilities: Data enrichment/gap fill, adding attributes, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc. Data quality checks and correction. Data profiling and reporting (basic). Email communication with the client on request acknowledgment, project status, and responses to queries. Help customers enhance their product data quality from the technical specification and description perspective. Provide technical consulting to the customer's category managers on industry best practices for product data enhancement.

Technical and functional skills: Bachelor's Degree (any graduate). Good understanding of tools and technology. Intermediate knowledge of MS Office/Internet.
Posted 3 weeks ago
0.0 - 3.0 years
2 - 6 Lacs
Mumbai
Work from Office
About The Role: eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality. The role also involves creating technical specifications and product descriptions for online presentation, as well as working on consultancy projects to redesign e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area is a plus.

Apprentice Analyst roles and responsibilities: Data enrichment/gap fill, adding attributes, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc. Data quality checks and correction. Data profiling and reporting (basic). Email communication with the client on request acknowledgment, project status, and responses to queries. Help customers enhance their product data quality from the technical specification and description perspective. Provide technical consulting to the customer's category managers on industry best practices for product data enhancement.

Technical and functional skills: Bachelor's Degree (any graduate). Good understanding of tools and technology. Intermediate knowledge of MS Office/Internet.
Posted 3 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Gurugram
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while adhering to best practices in software development.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Mentor junior team members to enhance their skills and knowledge. Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills: Must-have: Proficiency in SAP BusinessObjects Data Services. Strong understanding of data integration and ETL processes. Experience with data quality management and data profiling. Familiarity with database technologies and SQL. Ability to troubleshoot and resolve technical issues effectively.

Additional Information: The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services. This position is based at our Gurugram office. 15 years of full-time education is required.
Posted 3 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Gurugram
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while adhering to best practices in software development.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Mentor junior team members to enhance their skills and knowledge. Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills: Must-have: Proficiency in SAP BusinessObjects Data Services. Strong understanding of data integration and ETL processes. Experience with data quality management and data profiling. Familiarity with database management systems and SQL. Ability to troubleshoot and resolve technical issues effectively.

Additional Information: The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services. This position is based at our Gurugram office. 15 years of full-time education is required.
Posted 3 weeks ago
4.0 - 8.0 years
8 - 12 Lacs
Pune
Work from Office
Job Title: Decision Science Practitioner Analyst, S&C GN
Management Level: Senior Analyst
Location: Bangalore/Kolkata
Must-have skills: Collibra Data Quality (data profiling, anomaly detection, reconciliation, data validation), Python, SQL
Good-to-have skills: PySpark, Kubernetes, Docker, Git

Job Summary: We are seeking a highly skilled and motivated Data Science cum Data Engineering Senior Analyst to lead innovative projects and drive impactful solutions in domains such as Consumer Tech, Enterprise Tech, and Semiconductors. This role combines hands-on technical expertise with client delivery management to execute cutting-edge projects in data science and data engineering.

Key Responsibilities (Data Science and Engineering): Implement and manage end-to-end data quality frameworks using Collibra Data Quality (CDQ), including requirement gathering from the client, code development in SQL, unit testing, client demos, user acceptance testing, documentation, etc. Work extensively with business users, data analysts, and other stakeholders to understand data quality requirements and business use cases. Develop data validation, profiling, anomaly detection, and reconciliation processes. Write SQL queries for simple to complex data quality checks, and Python and PySpark scripts to support data transformation and data ingestion. Deploy and manage solutions on Kubernetes workloads for scalable execution. Maintain comprehensive technical documentation of data quality processes and implemented solutions. Work in an Agile environment, leveraging Jira for sprint planning and task management. Troubleshoot data quality issues and collaborate with engineering teams for resolution. Provide insights for continuous improvement in data governance and quality processes. Build and manage robust data pipelines using PySpark and Python to read from and write to databases such as Vertica and PostgreSQL. Optimize and maintain existing pipelines for performance and reliability.
Build custom solutions using Python, including FastAPI applications and plugins for Collibra Data Quality. Oversee the infrastructure of the Collibra application in a Kubernetes environment, perform upgrades when required, and troubleshoot and resolve any Kubernetes issues that may affect the application's operation. Deploy and manage solutions and optimize resources for deployments in Kubernetes, including writing YAML files and managing configurations. Build and deploy Docker images for various use cases, ensuring efficient and reusable solutions.

Collaboration and Training: Communicate effectively with stakeholders to align technical implementations with business objectives. Provide training and guidance to stakeholders on Collibra Data Quality usage and help them build and implement data quality rules.

Version Control and Documentation: Use Git for version control to manage code and collaborate effectively. Document all implementations, including data quality workflows, data pipelines, and deployment processes, ensuring easy reference and knowledge sharing.

Database and Data Model Optimization: Design and optimize data models for efficient storage and retrieval.

Required Qualifications: Experience: 4+ years in data science. Education: B.Tech or M.Tech in Computer Science, Statistics, Applied Mathematics, or a related field. Industry knowledge: Experience in Consumer Tech, Enterprise Tech, or Semiconductors is preferred but not mandatory.

Technical Skills: Programming: Proficiency in Python and SQL for data analysis and transformation. Tools: Hands-on experience with Collibra Data Quality (CDQ) or similar data quality tools (e.g., Informatica DQ, Talend, Great Expectations, Ataccama). Experience working with Kubernetes workloads. Experience with Agile methodologies and task tracking using Jira.

Preferred Skills: Strong analytical and problem-solving skills with a results-oriented mindset. Good communication, stakeholder management, and requirement-gathering capabilities.
Additional Information: The ideal candidate will possess a strong educational background in a quantitative discipline and experience working with Hi-Tech clients. This position is based at our Bengaluru (preferred) and Kolkata offices. About Our Company | Accenture

Qualification: Experience: 4+ years in data science. Educational Qualification: B.Tech or M.Tech in Computer Science, Statistics, Applied Mathematics, or a related field.
Posted 3 weeks ago
12.0 - 15.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding the team in implementing effective solutions. You will also engage in strategic planning and decision-making processes, ensuring that the applications align with organizational objectives and user needs. Your role will require you to balance technical expertise with leadership skills, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Facilitate training and knowledge-sharing sessions to enhance team capabilities. Monitor project progress and ensure adherence to timelines and quality standards.

Professional & Technical Skills: Must-have: Proficiency in SAP BusinessObjects Data Services. Strong understanding of data integration and ETL processes. Experience with data quality management and data profiling. Familiarity with database technologies and SQL. Ability to troubleshoot and resolve technical issues effectively.

Additional Information: The candidate should have a minimum of 12 years of experience in SAP BusinessObjects Data Services. This position is based at our Bengaluru office. 15 years of full-time education is required.
Qualification: 15 years of full-time education
Posted 3 weeks ago
3.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Hybrid
Job Title: Data Governance & Quality Specialist
Experience: 3-8 Years
Location: Bangalore (Hybrid)
Domain: Financial Services
Notice Period: Immediate to 30 Days

What You'll Do: Define and enforce data-governance policies (BCBS 239/GDPR) across credit-risk datasets. Design, monitor, and report on data-quality KPIs; perform profiling and root-cause analysis in SAS/SQL. Collaborate with data stewards, risk teams, and auditors to remediate data issues. Develop governance artifacts: data-lineage maps, stewardship RACI, council presentations.

Must Have: 3-8 years in data-governance or data-quality roles (financial services). Advanced SAS for data profiling and reporting; strong SQL skills. Hands-on experience with governance frameworks and regulatory requirements. Excellent stakeholder-management and documentation abilities.

Nice to Have: Experience with Collibra, Informatica, or Talend. Exposure to credit-risk model inputs (PD/LGD/EAD). Automation via SAS macros or Python scripting.

If interested, please share your resume with sunidhi.manhas@portraypeople.com
Posted 3 weeks ago
1.0 - 7.0 years
6 - 7 Lacs
Hyderabad
Work from Office
Job Description: Senior/Azure Data Engineer
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai

At least 5+ years of relevant hands-on development experience in an Azure Data Engineering role. Proficient in Azure technologies such as ADB, ADF, SQL (capable of writing complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog. Hands-on in Python, PySpark, or Spark SQL. Hands-on in Azure Analytics and DevOps. Takes part in Proof of Concepts (POCs) and pilot solution preparation. Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows. Experience in business process mapping of data and analytics solutions.

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services, such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
Posted 3 weeks ago
4.0 - 7.0 years
15 - 25 Lacs
Bengaluru
Work from Office
The Opportunity
Nutanix is a global leader in cloud software and a pioneer in hyper-converged infrastructure solutions, making clouds invisible and freeing customers to focus on their business outcomes. Organizations worldwide use Nutanix software to leverage a single platform to manage any app at any location across their hybrid multi-cloud environments.

Your Role
- At least 4+ years of experience in MDM development and implementation
- Completed at least 2 full life-cycle MDM/DG projects using tools such as Informatica
- Strong in SQL, with experience in Informatica MDM/DG
- Hands-on experience across the various phases of a typical DQM solution: data profiling, data integration, validation, cleansing, standardization, matching, consolidation, etc.
- Understanding and experience of software development best practices
- Excellent business communication, consulting, and quality process skills
- Understand and assess Source-to-Target mapping documents and provide recommendations where needed
- Manage enhancements using Informatica MDM, Oracle, and SQL Server coding skills
- Experience working independently, efficiently, and effectively under tight timelines and delivering results by critical deadlines
- Strong analytical and problem-solving skills

What You Will Bring
- At least 7+ years of experience in configuring and designing Informatica MDM versions 10+
- 7+ years of relevant data management consulting or industry experience (multiple master data domains, data modeling, data quality, and governance)
- 7+ years of in-depth experience with Informatica MDM multi-domain edition and/or C360
- Bachelor's Degree or 10+ years equivalent professional experience
- Ability to configure complex UIs for Informatica MDM using the Provisioning tool or C360, including hierarchies
- Ability to develop complex MDM services and user exits using Java
- Deep understanding of MDM upstream and downstream integration
- Experience with the Pub/Sub model and/or other integration patterns
- Knowledgeable in Informatica PowerCenter/Data Integration and Informatica Data Quality
- Experience in ActiveVOS workflow management/Application Integration is a must
- Strong knowledge of SQL with Postgres, Oracle, or SQL Server, with the ability to write complex queries and develop functions and stored procedures
- Knowledge of data sources in the Account & Contact domains
- Excellent troubleshooting skills
Posted 3 weeks ago
3.0 - 6.0 years
10 - 14 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience in the integration efforts between Alation and Manta, ensuring seamless data flow and compatibility
- Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively
- Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities
- Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns

Preferred technical and professional experience:
- Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives
- Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation
Posted 4 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Number of Openings: 1
Approved ECMS: Yes
Duration of contract: 6 Months
Total Yrs. of Experience: 5-8 years
Relevant Yrs. of experience: >4 years
Mandatory skills: PySpark, Azure Databricks
Desired skills: PySpark, Azure Databricks, SQL
Domain: Anything
Vendor billing rate (INR/Day): 5500 INR/Day
Work Location: Only Pune
Background check process: Post
Mode of Interview: Skype & F2F
WFO / WFH / Hybrid: WFO
Any Certification (Mandatory): No
Shift Time: General
Business travel required: No

Job Description
Education and Qualifications: Bachelor, Master, or Doctorate degree in maths, statistics, computer science, or information management
Work Experience: 5-8 years of experience working in Data Engineering and Data Warehousing

Technical / Professional Skills (essential for this role):
- Hands-on with advanced SQL, Python, etc.
- Hands-on in data profiling
- Hands-on working on cloud platforms like Azure and cloud data warehouses like Databricks
- Hands-on experience with scheduling tools like Airflow, Control-M, etc.
- Knowledgeable in Big Data tools like Spark (Python/Scala), Hive, Impala, and Hue, and storage (e.g. HDFS, HBase)
- Knowledgeable in CI/CD processes: Bitbucket/GitHub, Jenkins, Nexus, etc.
- Knowledgeable in managing structured and unstructured data types
- Knowledgeable in building microservices, REST APIs, and Data-as-a-Service architectures
- Knowledge of streaming frameworks (Kafka/Spark Streaming) is preferred

Non-Technical / Soft Skills:
- Experience working with a team of data engineers spread across different locations and time zones
- Excellent communication
- Effective prioritisation
- Pragmatic stakeholder management

Other Task-Specific Knowledge:
- Proficient in Excel, Word, and PowerPoint
- Ability to prioritise and manage multiple concurrent activities
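The hands-on data-profiling skill called for above typically starts with per-column null rates. A minimal pure-Python sketch follows; in practice this would run over a PySpark/Databricks DataFrame, and the sample records and column names here are hypothetical.

```python
# Hypothetical sample records standing in for rows of a Spark DataFrame.
records = [
    {"id": 1, "email": "a@x.com", "age": 34},
    {"id": 2, "email": None,      "age": 29},
    {"id": 3, "email": "c@x.com", "age": None},
]

def null_rates(rows):
    """Fraction of missing values per column, a basic profiling metric."""
    cols = rows[0].keys()
    n = len(rows)
    return {c: sum(r[c] is None for r in rows) / n for c in cols}

profile = null_rates(records)
print(profile)  # {'id': 0.0, 'email': 0.3333333333333333, 'age': 0.3333333333333333}
```

In PySpark the equivalent is usually an aggregation of `count(when(col(c).isNull(), 1))` per column, divided by the row count.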
Posted 4 weeks ago
4.0 - 7.0 years
2 - 7 Lacs
Pune
Remote
Business Informatica Consultant

Seeking highly experienced Business Informatica Consultants to join our data team in India. The ideal candidates will have 3-7 years of experience in delivering data solutions to clients in the areas of data quality, data profiling, and data integration using Informatica Data Quality (IDQ) in an AWS environment. The candidates must possess strong AWS technical skills, a deep understanding of the Informatica Data Quality and Informatica PowerCenter products, and hands-on experience in designing, implementing, and supporting these tools in an AWS environment.

Key Responsibilities:
- Engage with client stakeholders to understand the requirements and own the end-to-end solution and delivery.
- Design, develop, deploy, and support end-to-end Informatica IDQ solutions to support enterprise-level data quality initiatives.
- Integrate IDQ with Informatica PowerCenter, MDM, and other ETL/BI tools as needed.
- Create and maintain technical documentation, including design specifications and test plans.
- Lead performance tuning and troubleshoot complex issues in IDQ workflows and mappings.
- Ensure compliance with enterprise architecture standards, policies, and best practices.

Required Skills & Qualifications:
- 3-7 years of IT experience with at least 3+ years in Informatica IDQ.
- Experience in cloud-based data platforms (AWS).
- Experience integrating IDQ with Informatica PowerCenter or MDM is a plus.
- Strong analytical and problem-solving skills.
- Experience working in Agile/Scrum environments.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Informatica certifications (IDQ/PowerCenter).
- Exposure to data governance frameworks and tools.
Posted 4 weeks ago
2.0 - 5.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Software Engineering Advisor Fullstack (API Developer)

Position Overview
Developer with 11-13 years of strong design and development experience to build robust APIs and services using Java and Spring Boot, coupled with hands-on experience in data processing. Has the knowledge and experience to design and implement scalable on-prem/cloud solutions that efficiently manage and leverage large datasets. Proficient in Java/Spring Boot, with a demonstrated ability to integrate with different databases and other APIs and services while ensuring security and best practices are followed throughout the development lifecycle. Creates a working agreement within the team and with the other stakeholders involved. Partners and aligns with Enterprise Architect, Product Owner, Production Support, Analyst, PVS Testing team, and data engineers to build solutions that conform to defined standards. Performs code reviews and implements suggested improvements.

Responsibilities
- Design, develop, and maintain APIs using Java and Spring Boot, and ensure efficient data exchange between applications.
- Implement API security measures including authentication, authorization, and rate limiting.
- Document API specifications and maintain API documentation for internal and external users.
- Develop integrations with different data sources and other APIs/web services.
- Develop integrations with IBM MQ and Kafka.
- Develop and maintain CI/CD pipelines.
- Perform performance evaluation and application tuning.
- Monitor and troubleshoot applications for stability and performance.
- Carry out data profiling of source data and generate logical data models (as required or applicable).
- Define, document, and complete System Requirements Specifications including Functional Requirements, Context Diagrams, Non-Functional Requirements, and Business Rules (as applicable for sprints to be completed).
- Create Source-to-Target mapping documents as required and applicable.
- Support definition of business requirements, including assisting the Product Owner in writing user stories and acceptance criteria.
- Support other scrum team members during the following activities (as required or applicable): design of test scenarios and test cases; developing and identifying data requirements for Unit, Systems, and Integration tests.

Qualifications
Required Skills:
- Programming Languages: Proficiency in Java.
- Web Development: Experience with SOAP and RESTful services.
- Database Management: Strong knowledge of SQL (Oracle).
- Version Control: Expertise in using version control systems like Git.
- CI/CD: Familiarity with CI/CD tools such as GitLab CI and Jenkins.
- Containerization & Orchestration: Experience with Docker and OpenShift.
- Messaging Queues: Knowledge of IBM MQ and Apache Kafka.
- Cloud Services: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Adept working experience in the design and development of performance-efficient ETL flows dealing with millions of rows in volume.
- Must have experience working in a SAFe Agile Scrum project delivery model.
- Good at writing complex SQL queries to pull data out of RDBMS databases like Oracle, SQL Server, DB2, Teradata, etc.
- Good working knowledge of Unix scripts.
- Batch job scheduling software such as CA ESP.
- Experienced in using CI/CD methodologies.

Required Experience & Education:
- Must have 11-13 years of hands-on development experience.
- Extensive experience developing and maintaining APIs.
- Experience managing and/or leading a team of developers.
- Working knowledge of data modelling, solution architecture, normalization, data profiling, etc.
- Adherence to good coding practices and technical documentation; must be a good team player.

Desired Skills:
- Analytical Thinking: Ability to break down complex problems and devise efficient solutions.
- Debugging: Skilled in identifying and fixing bugs in code and systems.
- Algorithm Design: Proficiency in designing and optimizing algorithms.
- Leadership: Proven leadership skills with experience mentoring junior engineers.
- Communication: Strong verbal and written communication skills.
- Teamwork: Ability to collaborate effectively with cross-functional teams.
- Time Management: Competence in managing time and meeting project deadlines.

Education
- Bachelor's degree in Computer Science, Software Engineering, or a related field. A Master's degree is a plus.
- Certifications: Relevant certifications in AWS are a plus.

Location & Hours of Work
- Full-time position, working 40 hours per week, with overlap with US hours as appropriate.
- Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform, and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care, and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
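The API rate-limiting responsibility mentioned in the posting above is commonly implemented with a token bucket. The sketch below is illustrative only (the posting's stack is Java/Spring Boot, where this would typically come from a library such as a servlet filter or gateway policy rather than hand-rolled code); the class and parameters are hypothetical.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: refills `rate` tokens/sec up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Very slow refill rate so the burst capacity dominates in this demo.
bucket = TokenBucket(rate=0.001, capacity=2)
results = [bucket.allow() for _ in range(3)]
print(results)  # first two requests pass, third is throttled: [True, True, False]
```

The same shape underlies rate limiting in API gateways: a per-client bucket keyed by API key or IP, with `allow()` consulted before the request is dispatched.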
Posted 4 weeks ago
5.0 - 7.0 years
19 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are seeking a mid-level GCP Data Engineer with 4+ years of experience in ETL, Data Warehousing, and Data Engineering. The ideal candidate will have hands-on experience with GCP tools, solid data analysis skills, and a strong understanding of Data Warehousing principles.

Qualifications:
- 4+ years of experience in ETL & Data Warehousing
- Excellent leadership & communication skills
- Experience developing Data Engineering solutions with Airflow, GCP BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc.
- Built solution automations in any of the above ETL tools
- Executed at least 2 GCP Cloud Data Warehousing projects
- Worked on at least 2 projects using Agile/SAFe methodology
- Mid-level experience in PySpark and Teradata
- Working experience with DevOps tools like GitHub, Jenkins, Cloud Native, etc., with semi-structured data formats like JSON, Parquet, and/or XML files, and with writing complex SQL queries for data analysis and extraction
- In-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality & Data Mapping

Education: B.Tech./B.E. in Computer Science or related field.
Certifications: Google Cloud Professional Data Engineer Certification.
Roles and Responsibilities
- Analyze the different source systems, profile data, and understand, document & fix Data Quality issues
- Gather requirements and business process knowledge in order to transform the data in a way that is geared towards the needs of end users
- Write complex SQLs to extract & format source data for ETL/data pipelines
- Create design documents, Source-to-Target Mapping documents, and any supporting documents needed for deployment/migration
- Design, develop, and test ETL/data pipelines
- Design & build metadata-based frameworks for data pipelines
- Write unit test cases, execute unit testing, and document unit test results
- Deploy ETL/data pipelines
- Use DevOps tools to version, push/pull code, and deploy across environments
- Support the team during troubleshooting & debugging of defects, bug fixes, business requests, environment migrations & other ad hoc requests
- Provide production support, enhancements, and bug fixes
- Work with business and technology stakeholders to communicate EDW incidents/problems and manage their expectations
- Leverage ITIL concepts to circumvent incidents, manage problems, and document knowledge
- Perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources
- Stay current on industry best practices and emerging technologies in data analysis and cloud computing, particularly within the GCP ecosystem
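One common "complex SQL for extraction" pattern from the responsibilities above is keeping only the latest record per key when staging a change-data feed. A hedged sketch follows: the `src` table, columns, and sample rows are hypothetical, and SQLite (with window-function support, available in modern builds) stands in for BigQuery or Teradata.

```python
import sqlite3

# Illustrative source extract: keep only the latest record per customer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (cust_id INTEGER, balance REAL, load_ts TEXT)")
conn.executemany(
    "INSERT INTO src VALUES (?, ?, ?)",
    [(1, 100.0, "2024-01-01"), (1, 120.0, "2024-02-01"), (2, 50.0, "2024-01-15")],
)

# ROW_NUMBER() over each customer's records, newest first; rn = 1 is the latest.
latest = conn.execute(
    """
    SELECT cust_id, balance FROM (
        SELECT cust_id, balance,
               ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY load_ts DESC) AS rn
        FROM src
    ) WHERE rn = 1
    ORDER BY cust_id
    """
).fetchall()
print(latest)  # [(1, 120.0), (2, 50.0)]
```

The identical `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` construct works in BigQuery standard SQL, which is where a GCP pipeline would typically run it.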
Posted 4 weeks ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
JD for SAP BODS
Role Description: SAP BusinessObjects Data Services
Competencies: Digital: SAP BusinessObjects Data Services
Experience (Years): 4-6
Essential Skills: SAP BusinessObjects Data Services

Key Responsibilities:
- Design, develop, test, and deploy ETL processes using SAP BODS.
- Extract, transform, and load data from SAP and non-SAP sources to target systems (e.g., HANA, BW, Oracle, SQL Server).
- Develop complex dataflows, workflows, and job scheduling within BODS.
- Perform data profiling, cleansing, transformation, and validation activities.
- Optimize ETL performance and ensure scalability and reliability of jobs.
- Collaborate with business analysts, data architects, and project managers to understand data requirements.
- Support data migration and data warehousing projects.
- Create and maintain technical documentation for ETL processes and data mappings.
- Troubleshoot and resolve issues related to data inconsistencies, job failures, and performance bottlenecks.
Posted 1 month ago
3.0 - 7.0 years
25 - 27 Lacs
Noida, Hyderabad
Work from Office
Position Summary
MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future ready. The center is integral to Global Technology and Operations, with a focus to protect & build MetLife IP, promote reusability, and drive experimentation and innovation. The Data & Analytics team in India mirrors the Global D&A team with an objective to drive business value through trusted data, scaled capabilities, and actionable insights. The operating model consists of business-aligned data officers (US, Japan, and LatAm) and corporate functions, enabled by enterprise COEs: data engineering, data governance, and data science.

Role Value Proposition
The Business Analyst Data Modeler is an important role in the Data and Analytics (D&A) organization. The role ensures data is structured, organized, and represented effectively, aligned to the needs of the organization. The role helps design logical & physical models, which includes implementation of robust data models that accurately capture, store, and manage data end to end.

Job Responsibilities
- Perform data modeling activity (logical, physical) using the data modeling tool CA Erwin Data Modeler
- Gather, understand & analyze business requirements accurately
- Analyze data using SQL
- Partner with other teams to understand data needs & translate them into effective data models
- Collaborate with stakeholders to provide domain-based solutions
- Implement industry data modeling standards, best practices, and emerging technologies in data modeling
- Hands-on experience with API development and integration (REST, SOAP, JSON, XML)

Education, Technical Skills & Other Critical Requirements
Education: Bachelor's degree in computer science, engineering, or a related tech or business discipline
Experience (in years): 3-5 years of hands-on experience in CA Erwin; 3-5 years of experience in SQL, data modelling, and data analysis
- Ability to communicate effectively, both orally and in writing, with various levels of management, including translating complex ideas and data into actionable steps for business units
- Understanding of API design and ingestion

Technical Skills
- Strong SQL skills
- Hands-on experience with the Erwin tool
- Familiarity with Agile best practices
- Strong collaboration and facilitation skills

Other Preferred Skills
- Familiarity with Azure cloud
- Experience in a client-facing role/environment
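The logical-to-physical modeling work described above ultimately lands as DDL with keys and constraints. A minimal hedged sketch (the `holder`/`policy` entities, columns, and sample data are hypothetical; in practice Erwin would forward-engineer this DDL for the target platform):

```python
import sqlite3

# Hypothetical physical model for a simple one-to-many policy/holder design.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE holder (
        holder_id INTEGER PRIMARY KEY,
        full_name TEXT NOT NULL
    );
    CREATE TABLE policy (
        policy_id INTEGER PRIMARY KEY,
        holder_id INTEGER NOT NULL REFERENCES holder(holder_id),
        premium   REAL CHECK (premium >= 0)
    );
    """
)
conn.execute("INSERT INTO holder VALUES (1, 'A. Sample')")
conn.execute("INSERT INTO policy VALUES (10, 1, 250.0)")

# A join mirroring the modeled one-to-many relationship.
rows = conn.execute(
    "SELECT h.full_name, p.premium FROM holder h JOIN policy p USING (holder_id)"
).fetchall()
print(rows)  # [('A. Sample', 250.0)]
```

The primary key, foreign key, and check constraint are the physical counterparts of the logical model's identifiers, relationships, and domain rules.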
Posted 1 month ago
5.0 - 7.0 years
4 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Informatica MDM professional with 5 to 7 years of experience. The ideal candidate will have expertise in defining data models and architectures, configuring MDM solutions, and designing and developing the BES UI.

Roles and Responsibility
- Define data models and architecture for MDM solutions.
- Configure MDM (Base Objects, staging tables, Match & Merge rules, Hierarchies, Relationship objects).
- Design and develop the BES UI.
- Design and develop C360 applications for data stewards according to client needs.
- Define data migration processes from legacy systems to MDM systems during M&A activities.
- Support and maintain MDM applications.

Job Requirements
- Minimum 5 years of experience in Informatica MDM.
- Strong knowledge of data modeling and architecture.
- Experience in configuring MDM solutions and designing the BES UI.
- Ability to define data migration processes.
- Strong understanding of data stewardship concepts.
- Excellent problem-solving skills and attention to detail.
Posted 1 month ago
6.0 - 10.0 years
12 - 16 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Informatica Architect
Job Type: Full-time, Contractor
Location: Hybrid, Bengaluru | Pune

About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary
Join our customer's team as an Informatica Architect and play a critical role in shaping data governance, data catalog, and data quality initiatives for enterprise-level products. As a key leader, you will collaborate closely with Data & Analytics leads, ensuring the integrity, accessibility, and quality of business-critical data assets across multiple domains.

Key Responsibilities
- Lead data governance, data catalog, and data quality efforts utilizing Informatica and other industry-leading tools
- Design, develop, and manage data catalogs and enterprise data assets to support analytics and reporting across the organization
- Configure and optimize Informatica CDQ and Data Quality modules, ensuring adherence to enterprise data standards and policies
- Implement and maintain business glossaries, data domains, data lineage, and data stewardship resources for enterprise-wide use
- Collaborate with cross-functional teams to define critical data elements, data governance rules, and quality policies for multiple data sources
- Develop dashboards and visualizations to support data quality monitoring, compliance, and stewardship activities
- Continuously review, assess, and enhance data definitions, catalog resources, and governance practices to stay ahead of evolving business needs

Required Skills and Qualifications
- Minimum 7-8 years of enterprise data integration, management, and governance experience with proven expertise in EDW technologies
- At least 5 years of hands-on experience with Informatica CDQ and Data Quality solutions, having executed 2+ large-scale Data Governance and Quality projects from inception to production
- Demonstrated proficiency configuring business glossaries, policies, dashboards, and search functions within Informatica or similar platforms
- In-depth expertise in data quality, data cataloguing, and data governance frameworks and best practices
- Strong background in Master Data Management (MDM), ensuring oversight and control of complex product catalogs
- Exceptional written and verbal communication skills, able to effectively engage technical and business stakeholders
- Experience collaborating with diverse teams to deliver robust data governance and analytics solutions

Preferred Qualifications
- Administration and management experience with industry data catalog tools such as Collibra, Alation, or Atlan
- Strong working knowledge of configuring user groups, permissions, data profiling, and lineage within catalog platforms
- Hands-on experience implementing open-source data catalog tools in enterprise environments
Posted 1 month ago