2.0 - 5.0 years
3 - 7 Lacs
Pune
Work from Office
Job Title: ETL Tester
Job Type: Full-time
Location: On-site; Hyderabad, Pune, or New Delhi

Job Summary:
Join our customer's team as a dedicated ETL Tester, where your expertise will drive the quality and reliability of crucial business data solutions. As an integral part of our testing group, you will focus on ETL testing while engaging in automation, API, and MDM testing to support robust, end-to-end data validation and integration. We value professionals who demonstrate strong written and verbal communication and a passion for delivering high-quality solutions.

Key Responsibilities:
- Design, develop, and execute comprehensive ETL test cases, scenarios, and scripts to validate data extraction, transformation, and loading processes.
- Collaborate with data engineers, business analysts, and QA peers to clarify requirements and ensure accurate data mapping, lineage, and transformations.
- Perform functional, automation, API, and MDM testing to support a holistic approach to quality assurance.
- Utilize tools such as Selenium to drive automation efforts for repeatable and scalable ETL testing processes.
- Identify, document, and track defects while proactively communicating risks and issues to stakeholders with clarity and detail.
- Work on continuous improvement initiatives to enhance test coverage, efficiency, and effectiveness within the ETL testing framework.
- Create and maintain detailed documentation for test processes and outcomes, supporting both internal knowledge sharing and compliance requirements.

Required Skills and Qualifications:
- Strong hands-on experience in ETL testing, including understanding of ETL tools and processes.
- Proficiency in automation testing using Selenium or similar frameworks.
- Experience in API testing, functional testing, and MDM testing.
- Excellent written and verbal communication skills, with an ability to articulate technical concepts clearly to diverse audiences.
- Solid analytical and problem-solving abilities to troubleshoot data and process issues.
- Attention to detail and a commitment to high-quality deliverables.
- Ability to thrive in a collaborative, fast-paced team environment on-site at Hyderabad.

Preferred Qualifications:
- Prior experience working in large-scale data environments or within MDM projects.
- Familiarity with data warehousing concepts, SQL, and data migration best practices.
- ISTQB or related QA/testing certification.
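A core task in this role is validating that loaded data matches its source. Below is a minimal, hypothetical sketch of one such automated check in Python, using in-memory SQLite stand-ins for the real source and target systems; the table and column names are invented for illustration only.

```python
import sqlite3

# Hypothetical in-memory source and target databases, standing in for the
# real upstream system and warehouse in an ETL test harness.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.5)])

def reconcile(conn_a, conn_b, table):
    """Compare row counts and a simple aggregate checksum between two systems."""
    count_sql = f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
    rows_a, sum_a = conn_a.execute(count_sql).fetchone()
    rows_b, sum_b = conn_b.execute(count_sql).fetchone()
    assert rows_a == rows_b, f"row count mismatch: {rows_a} vs {rows_b}"
    assert abs(sum_a - sum_b) < 1e-9, f"checksum mismatch: {sum_a} vs {sum_b}"
    return rows_a

print("rows validated:", reconcile(src, tgt, "orders"))
```

Real suites would extend this with column-level comparisons and transformation-rule checks, but the count-and-checksum pattern is the usual first gate.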
Posted 1 week ago
2.0 - 5.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Job Description:
Tietoevry Create is seeking a skilled Snowflake Developer to join our team in Bengaluru, India. In this role, you will be responsible for designing, implementing, and maintaining data solutions using Snowflake's cloud data platform. You will work closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business value.

- 7+ years of experience in design and development of data warehouse and data integration projects (SSE/TL level).
- Experience working in an Azure environment.
- Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; writing SQL queries against Snowflake.
- Good understanding of database design concepts: transactional, datamart, data warehouse, etc.
- Expertise in loading from disparate data sets and translating complex functional and technical requirements into detailed design. Will also perform analysis of vast data stores and uncover insights.
- Snowflake data engineers will be responsible for architecting and implementing substantial-scale data intelligence solutions around Snowflake Data Warehouse.
- A solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is a must.
- Very good articulation skills; flexible and ready to learn new skills.

Additional Information:
At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com).
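The role centers on driving ETL logic into Snowflake SQL from Python. A minimal sketch of that pattern with the snowflake-connector-python package follows; the account, credentials, and table names are placeholders, not details from the posting.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection values are illustrative placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # A transform-and-load step pushed down to Snowflake SQL rather than
    # pulling rows into Python -- the usual pattern for warehouse ETL.
    cur.execute("""
        INSERT INTO staging.daily_sales
        SELECT order_date, SUM(amount)
        FROM raw.orders
        GROUP BY order_date
    """)
    print("rows loaded:", cur.rowcount)
finally:
    conn.close()
```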
Posted 1 week ago
3.0 - 4.0 years
11 - 14 Lacs
Mumbai
Work from Office
AEP Data Architect

Key Responsibilities and Qualifications:
- 10+ years of strong experience with data transformation & ETL on large data sets.
- Experience with designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale, etc.).
- 5+ years of data modeling experience (e.g., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Experience in advanced data warehouse concepts.
- Experience in industry ETL tools (e.g., Informatica, Unifi).
- Experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with the sales team and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.
- Degree in Computer Science, Information Systems, Data Science, or a related field.
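For the dimensional-modeling requirement above, here is a small, hypothetical star-schema sketch using Python's standard-library SQLite; the customer and sales entities are invented for illustration, not part of the posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One fact table keyed to two dimensions -- the minimal "star" shape used
# in dimensional modeling.
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    crm_id       TEXT,                  -- natural key from the source CRM
    segment      TEXT
);
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,     -- e.g. 20240131
    full_date  TEXT,
    fiscal_qtr TEXT
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")
```

Queries then join the narrow fact table to the descriptive dimensions, which is what keeps large customer-centric datasets both fast and understandable.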
Posted 1 week ago
6.0 - 8.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary
The Sr. Developer role involves designing, developing, and maintaining data integration solutions using Ab Initio administration and other ETL tools. The candidate will work on data warehousing projects, ensuring efficient data processing and integration. This position requires a strong understanding of data warehousing concepts and proficiency in SQL and Unix shell scripting. The role is hybrid with no travel required.

Responsibilities:
- Design and develop data integration solutions using Ab Initio tools to ensure seamless data processing and transformation.
- Collaborate with cross-functional teams to gather and analyze requirements for data warehousing projects.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses.
- Optimize SQL queries to enhance performance and ensure efficient data retrieval and manipulation.
- Utilize Unix shell scripting for automation and scheduling of data processing tasks.
- Monitor and troubleshoot data integration workflows to ensure data accuracy and integrity.
- Provide technical support and guidance to team members on data warehousing and ETL best practices.
- Conduct regular reviews of data integration processes to identify areas for improvement and implement necessary changes.
- Ensure compliance with data governance and security standards in all data integration activities.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
- Develop and maintain documentation for data integration processes and workflows.
- Stay updated with the latest trends and technologies in data warehousing and ETL to drive innovation.
- Contribute to the company's mission by enabling data-driven decision-making through robust data integration solutions.

Qualifications:
- Demonstrated expertise in data warehousing concepts and scheduling basics to design effective data solutions.
- Strong skills in ETL and SQL to manage data extraction and transformation processes efficiently.
- Proficiency in Ab Initio GDE, Conduct>It, and Co>Operating System for advanced data integration tasks.
- Experience in Unix shell scripting to automate and streamline data workflows.
- Nice to have: domain experience in Telecom to understand industry-specific data requirements.
- Ability to work in a hybrid model, balancing remote and on-site tasks effectively.
- Strong analytical and problem-solving skills to address data integration challenges.

Certifications Required: Ab Initio Certification, SQL Certification
Posted 1 week ago
4.0 - 9.0 years
10 - 20 Lacs
Hyderabad, Bengaluru
Hybrid
Job Title: Informatica Data Quality (IDQ) Specialist
Experience: 4+ years (relevant)
Location: Hyderabad / Bangalore
Notice Period: Immediate to 15 days

Job Summary:
We are seeking an experienced Informatica Data Quality (IDQ) specialist with a solid background in IDQ, IDE, PowerCenter, and SQL/PL-SQL. The ideal candidate will have a proven track record of at least three end-to-end IDQ implementations, strong data quality expertise, and hands-on experience with real-time and batch data integration.

Key Responsibilities:
- Design and implement end-to-end IDQ solutions, including profiling, cleansing, standardization, matching, and monitoring.
- Perform admin/configuration tasks for Informatica IDQ and Informatica Data Explorer (IDE).
- Integrate IDQ with upstream/downstream systems using real-time and batch processing.
- Develop complex mappings in Informatica PowerCenter to support ETL workflows.
- Work with address validation tools such as Address Doctor, Trillium, etc.
- Tune performance for match/merge logic and resolve related issues.
- Implement data profiling and data quality dashboards for ongoing monitoring and improvements.
- Collaborate with data architects, analysts, and stakeholders on data governance and quality initiatives.

Required Skills:
- Minimum 4+ years of experience in Informatica IDQ and IDE.
- Strong working knowledge of ETL tools like Informatica PowerCenter.
- Hands-on experience with SQL and PL/SQL; Java knowledge is a plus.
- Proficient in data quality principles, strategies, and tools in the market.
- Experience integrating IDQ with other systems (batch and real-time).
- Familiarity with address validation/cleansing tools.
- Solid understanding of DBMS concepts and data warehousing fundamentals.
- Ability to troubleshoot and optimize IDQ performance issues.

Good to Have:
- Experience in data governance frameworks.
- Exposure to Java-based integration with Informatica.
- Certification in Informatica IDQ or related tools.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field.

If interested, please share your resume at subashini.gopalan@kiya.ai
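Profiling is the first step in most IDQ engagements. IDQ does this through its own tooling, so the following is only a generic, hypothetical illustration of the underlying idea in plain Python: measure null rates per column and flag duplicate keys in a delimited extract.

```python
import csv
from collections import Counter

def profile(path, key_field):
    """Profile a delimited file: null rate per column, plus duplicate keys."""
    with open(path, newline="") as fh:
        rows = list(csv.DictReader(fh))
    if not rows:                      # guard: empty extract, nothing to profile
        return {}, []
    total = len(rows)
    null_rates = {
        col: sum(1 for r in rows if not r[col]) / total
        for col in rows[0]
    }
    dupes = [k for k, n in Counter(r[key_field] for r in rows).items() if n > 1]
    return null_rates, dupes

# Usage (file name and key column are assumptions for the example):
# rates, dupes = profile("customers.csv", key_field="customer_id")
```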
Posted 1 week ago
4.0 - 8.0 years
13 - 23 Lacs
Bengaluru
Hybrid
Data Engineer Job Details
Position Title: Data Engineer II
Job Category: Senior Associate
Role Type: Hybrid
Job Location: Bangalore
Interview Mode: Face-to-face
Experience: 4-7 years
Key skills: Snowflake (min 2 years), ETL tools (Informatica/BODS/DataStage), scripting (Python/PowerShell/Shell), SQL, data warehousing. Add-on: AWS Glue, Data Lake, data modelling, etc.

ONLY CANDIDATES AVAILABLE FOR A FACE-TO-FACE INTERVIEW CAN APPLY (BANGALORE LOCATION).
Kindly share your updated CV to radhika@theedgepartnership.com

Technical/Business Skills:
- Data engineering: experience in designing and building data warehouses and data lakes; good knowledge of data warehouse principles and concepts; technical expertise working in large-scale data warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server; experience with public cloud-based data platforms, especially Snowflake and AWS.
- Data integration: expertise in design and development of complex data pipelines using industry-leading ETL tools such as SAP BusinessObjects Data Services (BODS), Informatica Cloud Data Integration Services (IICS), or IBM DataStage; knowledge of ELT tools such as DBT, Fivetran, and AWS Glue.
- Expert in SQL, with development experience in at least one scripting language (Python, etc.); adept at tracing and resolving data integrity issues.
- Data modelling: knowledge of logical and physical data models using relational or dimensional modelling practices, and of high-volume ETL/ELT processes.
- Performance tuning of data pipelines and DB objects to deliver optimal performance.
- Experience in GitLab version control and CI/CD processes.
- Experience working in the financial industry is a plus.
Posted 1 week ago
4.0 - 9.0 years
11 - 14 Lacs
Hyderabad
Work from Office
Overview
iCIMS is seeking a Financial Analyst to join our FP&A team, based in India and reporting directly to the Vice President of FP&A. This highly visible role will focus on project-based financial analysis across revenue, expense, and operational/corporate reporting, serving as a force multiplier for the broader FP&A team by creating scalable solutions, driving automation and ad-hoc project work, and enhancing reporting efficiency. The ideal candidate brings a strong blend of financial acumen and technical expertise, thrives on solving complex problems, and is passionate about enabling others through better data and tools. This role will work partially overlapping with U.S. business hours to collaborate effectively with global stakeholders.

Responsibilities
- Act as a technical enabler for the FP&A team by developing tools, templates, and scalable processes to enhance reporting, analytics, forecasting, and planning activities
- Integrate data from multiple systems (Tableau, Salesforce, NetSuite, Adaptive Insights, Excel) into consolidated reports and dashboards
- Build and maintain automated reporting solutions using Excel (including VBA/macros), Tableau, Adaptive, and other reporting tools to streamline workflows and improve data accessibility
- Lead and deliver financial projects spanning revenue, expense, and operational/corporate reporting, ensuring solutions align with business priorities
- Identify and implement process improvements to eliminate manual work, accelerate reporting timelines, and reduce errors
- Collaborate closely with finance, accounting, and operational stakeholders to understand reporting needs and proactively develop solutions
- Support monthly close, forecast, and long-range planning processes through development of reusable reporting models and automation
- Translate complex financial and operational data into actionable insights for executive leadership
- Maintain high standards of data accuracy, process consistency, and documentation across reporting deliverables

Qualifications
- 4+ years of experience in FP&A, business analysis, financial systems, or a related analytical role
- Expert-level Excel skills, including advanced formulas, data modeling, and VBA/macro development
- Proficiency with data visualization tools (Tableau, Power BI, or similar)
- Strong analytical and problem-solving abilities with a continuous improvement mindset
- Experience working with financial systems such as NetSuite, Adaptive Insights, Salesforce, or similar platforms
- Excellent communication skills, with the ability to translate complex data into clear insights for diverse stakeholders
- Proven ability to manage multiple complex deliverables simultaneously
- Proven ability to work independently while collaborating across global teams
- Ability and willingness to work overlapping U.S. business hours as required for collaboration

Education/Certifications/Licenses: Bachelor's degree in Finance, Accounting, Business Analytics, Computer Science, Information Systems, or a related field
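The posting emphasizes consolidating extracts from several systems into one automated report. A minimal, hypothetical sketch of that task in Python with openpyxl follows; the folder layout and file names are assumptions, and real pipelines from Salesforce or NetSuite would pull via their APIs rather than from dropped CSVs.

```python
import csv
from pathlib import Path
from openpyxl import Workbook  # pip install openpyxl

# Hypothetical extracts dropped by upstream systems into one folder.
extracts = sorted(Path("extracts").glob("*.csv"))

wb = Workbook()
ws = wb.active
ws.title = "Consolidated"
header_written = False
for path in extracts:
    with open(path, newline="") as fh:
        reader = csv.reader(fh)
        header = next(reader)
        if not header_written:
            ws.append(header + ["source_file"])  # write header once
            header_written = True
        for row in reader:
            ws.append(row + [path.name])         # tag each row with its source
wb.save("consolidated_report.xlsx")
```

Tagging each row with its source file is a small habit that makes downstream variance questions ("where did this number come from?") much easier to answer.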
Posted 1 week ago
8.0 - 13.0 years
15 - 30 Lacs
Hyderabad, Bengaluru
Hybrid
Essential Responsibilities:

Architecture & Design
- Define and document the overall data platform architecture in GCP, including ingestion (Pub/Sub, Dataflow), storage (BigQuery, Cloud Storage), and orchestration (Composer, Workflows).
- Establish data modeling standards (star/snowflake schemas, partitioning, clustering) to optimize performance and cost.

Platform Implementation
- Build scalable, automated ETL/ELT pipelines for IoT telemetry and events.
- Implement streaming analytics and CDC where required to support real-time dashboards and alerts.

Data Products & Exchange
- Collaborate with data scientists and product managers to package curated datasets and ML feature tables as consumable data products.
- Architect and enforce a secure, governed data exchange layer, leveraging BigQuery Authorized Views, Data Catalog, and IAM, to monetize data externally.

Cost Management & Optimization
- Design cost-control measures: table partitioning/clustering, query cost monitoring, budget alerts, and committed-use discounts.
- Continuously analyze query performance and storage utilization to drive down TCO.

Governance & Security
- Define and enforce data governance policies (cataloging, lineage, access controls) using Cloud Data Catalog and Cloud IAM.
- Ensure compliance with privacy, security, and regulatory requirements for internal and external data sharing.

Stakeholder Enablement
- Partner with business stakeholders to understand data needs and translate them into platform capabilities and SLAs.
- Provide documentation, training, and self-service tooling (Data Studio templates, APIs, notebooks) to democratize data access.

Mentorship & Leadership
- Coach and mentor engineers on big data best practices, SQL optimization, and cloud-native architecture patterns.
- Lead architecture reviews, proof-of-concepts, and pilot projects to evaluate emerging technologies (e.g., BigQuery Omni, Vertex AI).

What You'll Bring to Our Team
Minimum Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field.
- 8+ years designing and operating large-scale data platforms, with at least 5 years hands-on experience in GCP (BigQuery, Dataflow, Pub/Sub).
- Deep expertise in BigQuery performance tuning, data partitioning/clustering, and cost-control techniques.
- Proven track record building streaming and batch pipelines (Apache Beam, Dataflow, Spark).
- Strong SQL skills and experience with data modeling for analytics.
- Familiarity with data governance tools: Data Catalog, IAM, VPC Service Controls.
- Experience with Python or Java for ETL/ELT development.
- Excellent communication skills, able to translate technical solutions for non-technical stakeholders.
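Partitioning and clustering are the main BigQuery cost-control levers named above: partitions prune whole days of telemetry from a scan, and clustering narrows it further within a partition. A small sketch with the google-cloud-bigquery client follows; the project, dataset, and table names are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses application-default credentials

# Partition IoT telemetry by event day and cluster by device so that
# time-bounded, per-device queries scan only a sliver of the table.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.telemetry.events`
(
    device_id  STRING,
    event_ts   TIMESTAMP,
    payload    JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY device_id
"""
client.query(ddl).result()  # blocks until the DDL job completes
```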
Posted 1 week ago
10.0 - 15.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Job Description
We are seeking a highly skilled and experienced Data Architect to design, implement, and manage the data infrastructure. As a Data Architect, you will play a key role in shaping the data strategy, ensuring data is accessible, reliable, and secure across the organization. You will work closely with business stakeholders, data engineers, and analysts to develop scalable data solutions that support business intelligence, analytics, and operational needs.

Key Responsibilities
- Design and implement effective database solutions (on-prem/cloud) and data models to store and retrieve data for various applications within the FinCrime domain.
- Develop and maintain robust data architecture strategies aligned with business objectives.
- Define data standards, frameworks, and governance practices to ensure data quality and integrity.
- Collaborate with data engineers, software developers, and business stakeholders to integrate data systems and optimize data pipelines.
- Evaluate and recommend tools and technologies for data management, warehousing, and processing.
- Create and maintain documentation related to data models, architecture diagrams, and processes.
- Ensure data security and compliance with relevant regulations (e.g., GDPR, HIPAA, CCPA).
- Participate in capacity planning and growth forecasting for the organization's data infrastructure.
- Through various POCs, assess and compare multiple tooling options and deliver use cases based on an MVP model as per expectations.

Requirements
Experience:
- 10+ years of experience in data architecture, data engineering, or related roles.
- Proven experience with relational and NoSQL databases.
- Experience with FinCrime domain applications and reporting.
- Strong experience with ETL tools, data warehousing, and data lake solutions.
- Familiarity with other data technologies such as Spark, Kafka, and Snowflake.

Skills:
- Strong analytical and problem-solving skills.
- Proficiency in data modelling tools (e.g., ER/Studio, Erwin).
- Excellent understanding of database management systems and data security.
- Knowledge of data governance, metadata management, and data lineage.
- Strong communication and interpersonal skills to collaborate across teams.
- Subject matter expertise within FinCrime.

Preferred Qualifications
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
Pune
Work from Office
What You'll Do
We are seeking a highly skilled and motivated Senior Data Engineer to join our Data Operations team. The ideal candidate will have deep expertise in Python, Snowflake SQL, modern ETL tools, and business intelligence platforms such as Power BI. This role also requires experience integrating SaaS applications such as Salesforce, Zuora, and NetSuite using REST APIs. You will be responsible for building and maintaining data pipelines, developing robust data models, and ensuring seamless data integrations that support business analytics and reporting. The role requires flexibility to collaborate in US time zones as needed.

What Your Responsibilities Will Be
- Design, develop, and maintain scalable data pipelines and workflows using modern ETL tools and Python.
- Build and optimize SQL queries and data models on Snowflake to support analytics and reporting needs.
- Integrate with SaaS platforms such as Salesforce, Zuora, and NetSuite using APIs or native connectors.
- Develop and support dashboards and reports using Power BI and other reporting tools.
- Work closely with data analysts, business users, and other engineering teams to gather requirements and deliver high-quality solutions.
- Ensure data quality, accuracy, and consistency across systems and datasets.
- Write clean, well-documented, and testable code with a focus on performance and reliability.
- Participate in peer code reviews and contribute to best practices in data engineering.
- Be available for meetings and collaboration in US time zones as required.

What You'll Need To Be Successful
- 5+ years' experience in the data engineering field, with deep SQL knowledge.
- Strong experience in Snowflake SQL, Python, AWS services, Power BI, and ETL tools (DBT, Airflow) is a must.
- Proficiency in Python for data transformation and scripting.
- Proficiency in writing complex SQL queries and stored procedures.
- Strong experience in data warehouse, data modeling, and ETL design concepts.
- Should have integrated SaaS systems like Salesforce, Zuora, and NetSuite along with relational databases, REST APIs, FTP/SFTP, etc.
- Knowledge of AWS technologies (EC2, S3, RDS, Redshift, etc.).
- Excellent communication skills, with the ability to translate technical issues for non-technical stakeholders.
- Flexibility to work during US business hours as required for team meetings and collaboration.

How We'll Take Care Of You
Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance.
Inclusive culture and diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship.

What You Need To Know About Avalara
We're Avalara. We're defining the relationship between tax and tech. We've already built an industry-leading cloud compliance platform, processing nearly 40 billion customer API calls and over 5 million tax returns a year, and this year we became a billion-dollar business. Our growth is real, and we're not slowing down until we've achieved our mission to be part of every transaction in the world. We're bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we've designed, that empowers our people to win. Ownership and achievement go hand in hand here. We instill passion in our people through the trust we place in them. We've been different from day one. Join us, and your career will be too.

We're An Equal Opportunity Employer
Supporting diversity and inclusion is a cornerstone of our company: we don't want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national orientation, disability, sexual orientation, US veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
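The stack named above includes Airflow for orchestrating SaaS-to-Snowflake loads. A minimal, hypothetical two-task DAG sketch follows; the DAG id, task bodies, and schedule are assumptions, and the callables are placeholders where real extract and load logic would go.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_salesforce(**_):
    # Placeholder: pull incremental records via the Salesforce REST API here.
    print("extracted")

def load_snowflake(**_):
    # Placeholder: MERGE the staged records into Snowflake here.
    print("loaded")

with DAG(
    dag_id="saas_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # Airflow 2.x; newer releases prefer `schedule`
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_salesforce)
    load = PythonOperator(task_id="load", python_callable=load_snowflake)
    extract >> load  # load runs only after a successful extract
```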
Posted 1 week ago
5.0 - 10.0 years
2 - 3 Lacs
Kolkata, Ramgarh
Work from Office
Sodexo Food Solutions India Pvt. Ltd. is seeking an HR cum MIS Executive to join our dynamic team and embark on a rewarding career journey.
- Sound knowledge and hands-on experience with HLOOKUP, VLOOKUP, pivot tables, conditional formatting, etc.
- Good at preparing MIS reports
- Perform data analysis for generating reports on a periodic basis
- Provide strong reporting and analytical information support
- Knowledge of various MIS reporting tools
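For readers more at home in Python than Excel, the two core skills above map directly onto pandas operations: VLOOKUP is a left join, and a pivot table is `pivot_table`. The data below is invented purely to show the shapes.

```python
import pandas as pd

# Hypothetical attendance log and employee master, as tiny in-memory tables.
attendance = pd.DataFrame({
    "emp_id": [101, 102, 101, 103],
    "site":   ["Kolkata", "Ramgarh", "Kolkata", "Ramgarh"],
    "days":   [22, 20, 21, 19],
})
master = pd.DataFrame({"emp_id": [101, 102, 103],
                       "name": ["Asha", "Ravi", "Meena"]})

# VLOOKUP equivalent: left-join attendance to the employee master on the key.
report = attendance.merge(master, on="emp_id", how="left")

# Pivot-table equivalent: total days by site and employee.
pivot = report.pivot_table(index="site", columns="name",
                           values="days", aggfunc="sum", fill_value=0)
print(pivot)
```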
Posted 1 week ago
15.0 - 20.0 years
25 - 32 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Role & responsibilities
15+ years' experience in handling projects ab initio (from inception). He/she must have strong technical experience with Microsoft technologies: .NET, MS SQL Server, TFS, Windows Server, BizTalk, etc. The candidate should have strength in technology, domain, and application development, and possess the leadership qualities to lead a team of minimum 40-50 professionals.

Responsibility Areas:
- Provide a leadership role in the areas of advanced data techniques, including data quality, data governance, data modeling, data access, data integration, data visualization, data discovery, database design, and implementation.
- Lead the overall strategy and roadmap for data architecture.
- Partner with the project organization, solution architecture, and engineering to ensure best use of standards for the key data use cases/patterns and tech standards.
- Analyze the Information Technology landscape to identify gaps and recommend improvements.
- Create and maintain the Enterprise Data Model at the conceptual, logical, and physical levels.
- Act as steward of enterprise metadata architecture and standards, and of data lifecycle management, including data quality, data conversion, and data security technologies.
- Define and achieve the strategy roadmap for the enterprise data, including data modeling, implementation, and data management for our enterprise data warehouse and advanced data analytics systems.
- Develop and document enterprise data standards and provide technical oversight on projects to ensure compliance through the adoption and promotion of industry standards and best-practice guiding principles aligned with Gartner, TOGAF, Forrester, and the like.
- Create architectural technology and business roadmaps that result in stronger business/IT alignment and drive adoption and usage of technology across the enterprise.
- Align the portfolio of projects to the roadmaps and reference architecture.
- Define and enforce architecture principles, standards, metrics, and policies.
- Provide leadership in the architecture, design, and build of complex applications, and perform architectural design reviews.
- Manage the development of transition plans for moving from the current to the future state environment across the application portfolio.
- Collaborate with both IT and business to influence decisions in technology investments.
- Evaluate data models and physical databases for variances and discrepancies.
- Validate business data objects for accuracy and completeness.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Support system analysts, engineers, programmers, and others on project limitations and capabilities, performance requirements, and interfaces.
- Support modifications to existing software to improve efficiency and performance.

Professional Qualification: B.Tech/BE/MCA/M.Tech/ME/PhD in Computer Science/Information Technology (IT) and related fields, or equivalent, with a consistently good academic record.

Preferred Professional Qualification/Certification: PMP or equivalent, CGEIT, ITIL (Foundation), PM tools, Microsoft certifications (MS SQL, BizTalk, .NET).

Interested candidates, share your resume at parul@mounttalent.com / parul.s@mounttalent.com.
Posted 1 week ago
2.0 - 5.0 years
6 - 13 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Software Developer II: Oracle Data Integrator (ODI)
Locations: Bangalore, Hyderabad, Chennai, Mumbai, Pune, Gurgaon, Kolkata
Experience: 2-5 years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic fun culture of inclusion, collaboration, and high performance, HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make? Visit us @ https://hashedin.com

JOB TITLE: Software Developer II: Oracle Data Integrator (ODI)

OVERVIEW OF THE ROLE:
We are looking for an experienced Oracle Data Integrator (ODI) and Oracle Analytics Cloud (OAC) Consultant to join our dynamic team. You will be responsible for designing, implementing, and optimizing cutting-edge data integration and analytics solutions. Your contributions will be pivotal in enhancing data-driven decision-making and delivering actionable insights across the organization.

Key Responsibilities:
- Develop robust data integration solutions using Oracle Data Integrator (ODI).
- Create, optimize, and maintain ETL/ELT workflows and processes.
- Configure and manage Oracle Analytics Cloud (OAC) to provide interactive dashboards and advanced analytics.
- Integrate and transform data from various sources to generate meaningful insights using OAC.
- Monitor and troubleshoot data pipelines and analytics solutions to ensure optimal performance.
- Ensure data quality, accuracy, and integrity across integration and reporting systems.
- Provide training and support to end-users for OAC and ODI solutions.
- Analyze, design, develop, fix, and debug software programs for commercial or end-user applications; write code, complete programming, and perform testing and debugging of applications.

Technical Skills:
- Expertise in ODI components such as Topology, Designer, Operator, and Agent.
- Experience in Java and WebLogic development.
- Proficiency in developing OAC dashboards, reports, and KPIs.
- Strong knowledge of SQL and PL/SQL for advanced data manipulation.
- Familiarity with Oracle databases and Oracle Cloud Infrastructure (OCI).
- Experience in data modeling and designing data warehouses.
- Strong analytical and problem-solving abilities.
- Excellent communication and client-facing skills.
- Hands-on, end-to-end DWH implementation experience using ODI.
- Experience in developing ETL processes (ETL control tables, error logging, auditing, data quality, etc.); should be able to implement reusability, parameterization, workflow design, etc.
- Expertise in the Oracle ODI toolset and Oracle PL/SQL; knowledge of the ODI Master and Work repositories.
- Knowledge of data modelling and ETL design.
- Setting up topology, building objects in Designer, monitoring Operator, different types of KMs, Agents, etc.
- Packaging components and database operations like aggregate, pivot, union, etc. using ODI mappings; error handling; automation using ODI; Load Plans; migration of objects.
- Design and develop complex mappings, process flows, and ETL scripts.
- Experience in performance tuning of mappings.
- Ability to design ETL unit test cases and debug ETL mappings.
- Expertise in developing Load Plans and scheduling jobs.
- Ability to design data quality and reconciliation frameworks using ODI.
- Integrate ODI with multiple sources/targets.
- Experience in error recycling/management using ODI and PL/SQL.
- Expertise in database development (SQL/PL-SQL) for PL/SQL-based applications.
- Experience in creating PL/SQL packages, procedures, functions, triggers, views, materialized views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle.
- Experience in data migration using SQL*Loader and import/export.
- Experience in SQL tuning and optimization using explain plans and SQL trace files.
- Strong knowledge of ELT/ETL concepts, design, and coding.
- Partitioning and indexing strategy for optimal performance.
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
- Ability to work with minimal guidance or supervision in a time-critical environment.

Experience:
- 4-6 years of overall industry experience.
- 3+ years of experience with Oracle Data Integrator (ODI) in data integration projects.
- 2+ years of hands-on experience with Oracle Analytics Cloud (OAC).

Preferred Skills:
- Knowledge of Oracle Autonomous Data Warehouse (ADW) and Oracle Integration Cloud (OIC).
- Familiarity with other analytics tools like Tableau or Power BI.
- Experience with scripting languages such as Python or shell scripting.
- Understanding of data governance and security best practices.

Educational Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
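The "ETL control tables, error logging, auditing" requirement above refers to a common pattern: every load run writes an audit row, success or failure. ODI manages this through its own repositories, so the following is only a generic, hypothetical illustration of the pattern in Python with SQLite; the table layout and job name are invented.

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("etl_control.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS etl_control (
    job_name   TEXT,
    started_at TEXT,
    ended_at   TEXT,
    status     TEXT,
    rows_moved INTEGER,
    error_msg  TEXT
)
""")

def run_with_audit(job_name, job_fn):
    """Wrap a load step so every run is recorded in the control table."""
    started = datetime.now(timezone.utc).isoformat()
    try:
        rows = job_fn()
        status, err = "SUCCESS", None
    except Exception as exc:          # capture the error for later recycling
        rows, status, err = 0, "FAILED", str(exc)
    conn.execute(
        "INSERT INTO etl_control VALUES (?, ?, ?, ?, ?, ?)",
        (job_name, started, datetime.now(timezone.utc).isoformat(),
         status, rows, err),
    )
    conn.commit()

run_with_audit("load_customers", lambda: 42)  # stand-in for a real mapping
```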
Posted 1 week ago
4.0 - 9.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
About the Role:
We are seeking a highly skilled and experienced Contract Implementation Analyst to join our team on a contractual basis. The ideal candidate will have a strong background in financial services and a deep understanding of reconciliation processes. You will be responsible for implementing our enterprise reconciliation product, ensuring seamless integration with client systems, and providing ongoing support.

Key Responsibilities:
Solution Implementation:
- Analyze client requirements to determine the optimal configuration of our reconciliation product.
- Design and implement customized solutions tailored to specific client needs.
- Configure and integrate the product with various systems and data sources.
- Test and validate the implemented solutions to ensure accuracy and performance.
Client Onboarding:
- Conduct thorough onboarding sessions with clients to understand their business processes and pain points.
- Provide comprehensive training and support to clients on product usage and best practices.
Technical Expertise:
- Leverage your strong technical skills to troubleshoot and resolve complex issues.
- Stay up to date with the latest industry trends and technologies.

Required Skills and Experience:
- A bachelor's or higher degree in Computer Science, Computer Engineering, or a similar discipline.
- 4+ years of experience in financial services, preferably in the reconciliations space.
- Deep understanding of financial concepts, including reconciliation, matching, and exception management.
- Proven expertise in implementing reconciliation solutions.
- Strong technical skills, including SQL, ETL, and data analysis.
- Experience with ISO 20022 standards is a plus.
- Familiarity with TLM, Duco, and IntelliMatch is a significant advantage.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Knowledge in Cloud (AWS)
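Matching and exception management, the core concepts named above, reduce to: pair records across two systems on a key, compare within a tolerance, and route everything else to an exception queue. A minimal, hypothetical sketch follows; the trade ids, amounts, and tolerance are invented, and real products like TLM or Duco apply far richer rule sets.

```python
from decimal import Decimal

# Hypothetical trade amounts from two systems, keyed by trade id.
internal  = {"T1": Decimal("100.00"), "T2": Decimal("250.10"), "T4": Decimal("75.00")}
custodian = {"T1": Decimal("100.00"), "T2": Decimal("250.12"), "T3": Decimal("90.00")}

TOLERANCE = Decimal("0.05")  # illustrative matching tolerance
matched, breaks = [], []
for trade_id in internal.keys() | custodian.keys():
    a, b = internal.get(trade_id), custodian.get(trade_id)
    if a is None or b is None:
        breaks.append((trade_id, "missing on one side"))
    elif abs(a - b) <= TOLERANCE:
        matched.append(trade_id)
    else:
        breaks.append((trade_id, f"amount diff {a - b}"))

print("matched:", sorted(matched))
print("exceptions:", breaks)  # breaks feed the exception-management workflow
```

Using Decimal rather than float matters here: binary floating point would introduce spurious sub-cent differences that a tolerance check then has to paper over.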
Posted 1 week ago
5.0 - 9.0 years
4 - 7 Lacs
Gurugram
Work from Office
Primary Skills
- SQL (advanced level)
- SSAS (SQL Server Analysis Services): multidimensional and/or tabular models
- MDX / DAX (strong querying capabilities)
- Data modeling (star schema, snowflake schema)

Secondary Skills
- ETL processes (SSIS or similar tools)
- Power BI / reporting tools
- Azure Data Services (optional but a plus)

Role & Responsibilities
- Design, develop, and deploy SSAS models (both tabular and multidimensional).
- Write and optimize MDX/DAX queries for complex business logic.
- Work closely with business analysts and stakeholders to translate requirements into robust data models.
- Design and implement ETL pipelines for data integration.
- Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred).
- Optimize existing cubes and data models for performance and scalability.
- Ensure data quality, consistency, and governance standards.

Top Skill Set
- SSAS (tabular + multidimensional modeling)
- Strong MDX and/or DAX query writing
- SQL (advanced level) for data extraction and transformations
- Data modeling concepts (fact/dimension, slowly changing dimensions, etc.)
- ETL tools (SSIS preferred)
- Power BI or similar BI tools
- Understanding of OLAP & OLTP concepts
- Performance tuning (SSAS/SQL)
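Fact/dimension modeling of the kind listed above usually starts with a conformed date dimension that every cube shares. A small, hypothetical Python generator for one follows; the column set is illustrative, not a prescribed standard.

```python
import csv
from datetime import date, timedelta

# Generate one year of a simple date dimension as CSV, ready for bulk load.
start, end = date(2024, 1, 1), date(2024, 12, 31)
with open("dim_date.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["date_key", "full_date", "year", "quarter", "month", "weekday"])
    d = start
    while d <= end:
        writer.writerow([
            d.strftime("%Y%m%d"),       # integer-style surrogate key, e.g. 20240115
            d.isoformat(),
            d.year,
            (d.month - 1) // 3 + 1,     # calendar quarter
            d.month,
            d.strftime("%A"),
        ])
        d += timedelta(days=1)
```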
Posted 1 week ago
3.0 - 8.0 years
2 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Understand the program's service catalog and document the list of tasks which must be performed for each.
- Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Implement best practices for data loading, ensuring optimal performance and data quality.
- Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
- Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
- Collaborate with data architects to design and implement scalable and efficient data architectures that support business intelligence and analytics requirements.
- Work on data modeling and schema design to optimize database structures for ETL processes.
- Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
- Troubleshoot and resolve issues related to data integration and performance bottlenecks.
- Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
- Provide guidance and mentorship to junior members of the data engineering team.
- Create and maintain comprehensive documentation for ETL processes, data models, and data flows; ensure that documentation is kept up to date with any changes to data architecture or ETL workflows.
- Use Jira for task tracking and project management.
- Implement data quality checks and validation processes to ensure data integrity and reliability.
- Maintain detailed documentation of data engineering processes and solutions.

Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS.
- Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
- Expertise in IDMC principles, including data governance, data quality, and metadata management.
- Solid understanding of data warehousing concepts and practices.
- Strong SQL skills and experience working with relational databases.
- Excellent problem-solving and analytical skills.

Qualified candidates should APPLY NOW for immediate consideration! Please hit APPLY to provide the required information, and we will be back in touch as soon as possible. Thank you!

ABOUT INNOVA SOLUTIONS:
Founded in 1998 and headquartered in Atlanta, Georgia, Innova Solutions employs approximately 50,000 professionals worldwide and reports an annual revenue approaching $3 billion. Through our global delivery centers across North America, Asia, and Europe, we deliver strategic technology and business transformation solutions to our clients, enabling them to operate as leaders within their fields.

Recent Recognitions:
- One of the largest IT consulting staffing firms in the USA, recognized as #4 by Staffing Industry Analysts (SIA 2022)
- ClearlyRated Client Diamond Award Winner (2020)
- One of the largest certified MBE companies in the NMSDC network (2022)
- Advanced Tier Services partner with AWS and Gold with MS
Posted 1 week ago
3.0 - 6.0 years
3 - 7 Lacs
Pune
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally, and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by over circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

Middle Office - Analyst - Business Systems
Location: Pune
Experience: 3-6 years
Designation: Associate
Industry/Domain: ETL/Mapping Tool, VBA, SQL, capital markets knowledge, bank debts, Solvas

Apex Group Ltd has an immediate requirement for a Middle Office Tech Specialist. As an ETL Techno-Functional Support Specialist at Solvas, you will be the bridge between technical ETL processes and end-users, ensuring the effective functioning and support of data integration solutions. Your role involves addressing user queries, providing technical support for ETL-related issues, and collaborating with both technical and non-technical teams to ensure a seamless data integration environment. You will contribute to the development, maintenance, and enhancement of ETL processes for the Solvas application, ensuring they align with business requirements.

Work Environment:
- Highly motivated, collaborative, and results-driven.
- Growing business within a dynamic and evolving industry.
- Entrepreneurial approach to everything we do.
- Continual focus on process improvement and automation.

Functional/Business Expertise Required:
- Serve as the primary point of contact for end-users seeking technical assistance related to Solvas applications.
- Address end-user queries related to ETL processes, data transformations, and data loads; provide clear and concise explanations to non-technical users regarding ETL functionalities and troubleshoot issues.
- Integrate client trade files into the Conversant system: design, develop, implement, and test technical solutions based on client and business requirements.
- Diagnose and troubleshoot ETL-related issues reported by end-users or identified through monitoring systems.
- Work closely with business analysts and end-users to understand and document ETL requirements.
- Monitor ETL jobs and processes to ensure optimal performance and identify potential issues.
- Create user documentation and guides to facilitate self-service issue resolution.
- Hands-on experience working with any ETL tool is mandatory.
- Strong command of SQL, VBA, and advanced Excel.
- Good understanding of Solvas or any other loan operation system.
- Good knowledge of Solvas bank debt workings is mandatory.
- Intermediate knowledge of financial instruments, both listed and unlisted, including OTCs; this includes but is not limited to derivatives, illiquid stocks, private equity, bank debts, and swaps.
- Understanding of the loan operation industry is necessary.
- Knowledge of market data provider applications (Bloomberg, Refinitiv, etc.).
- Proficiency in any loan operation system, preferably Solvas.
- An ability to work under pressure with changing priorities.
- Strong analytical and problem-solving skills.

Experience and Knowledge:
- 3+ years of related support/technical experience in a loan operation system and accounting system (Solvas/Geneva).
- Connect with operations to understand and resolve their issues.
- Experience working with data vendors (Bloomberg/Refinitiv/Markit).
- Able to handle reporting issues and new requirements raised by operations.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Strong Excel and Excel-functions knowledge for business support.
- Create and maintain business documentation, including user manuals and guides.
- Experience with system upgrades, migrations, and integrations.

Other Skills:
- Good team player; ability to work on a local, regional, and global basis.
- Good communication and management skills.
- Good understanding of financial services/capital markets/fund administration.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition team or hiring managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model, and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
Posted 1 week ago
5.0 - 7.0 years
18 - 20 Lacs
Mumbai
Work from Office
Work Timing: Standard IST; 6-month contract
Experience in batch/real-time integrations with ODI 11g, customizing knowledge modules, and design/development expertise. Skilled in ETL processes, PL/SQL, and building interfaces, packages, Load Plans, and sequences in ODI.
Required Candidate profile: Experience with the ODI Master and Work repositories, data modeling, and ETL design; multi-system integration, error handling, automation, and object migration in ODI; performance tuning, unit testing, and debugging of mappings.
Perks and benefits: MNC
Posted 1 week ago
7.0 - 12.0 years
25 - 27 Lacs
Kolkata, Bengaluru
Hybrid
Required Experience:
• Design, develop, and maintain ETL/ELT workflows using Informatica IICS.
• Collaborate with business and technical teams to understand requirements and translate them into robust data integration solutions.
• Optimize data pipelines for performance and scalability.
• Integrate IICS solutions with cloud-based data stores like Google BigQuery and cloud storage solutions.
• Develop data mappings, task flows, parameter files, and reusable objects.
• Manage deployments, migrations, and version control for IICS assets.
• Perform unit testing, debugging, and troubleshooting of ETL jobs.
• Document data flow and architecture as part of the SDLC.
• Work in an Agile environment and participate in sprint planning, reviews, and retrospectives.
• Provide mentorship and code reviews for junior developers, ensuring adherence to best practices and coding standards.

Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 7+ years of experience in ETL development, with at least 2-3 years in Informatica IICS.
• Strong experience in data integration, transformation, and orchestration using IICS.
• Good working knowledge of cloud data platforms, preferably Google Cloud Platform (GCP).
• Hands-on experience with Google BigQuery (GBQ), including writing SQL queries, data ingestion, and optimization.
• Strong SQL skills and experience with RDBMS (e.g., Oracle, SQL Server, PostgreSQL).
• Experience in integrating data from various sources including on-prem, SaaS applications, and cloud data lakes.
• Familiarity with data governance, data quality, and data cataloging tools.
• Understanding of REST APIs and experience with API integration in IICS.
• Excellent problem-solving skills and attention to detail.
• Strong communication skills and the ability to work effectively in a team.
Posted 1 week ago
6.0 - 8.0 years
15 - 18 Lacs
Bengaluru
Work from Office
Role Overview
For this newly created role, we are seeking a technically proficient Online Data Analyst to join our team. In this role, you will be responsible for extracting, transforming, and loading (ETL) data from various online sources using APIs, managing and querying large datasets with SQL, and ensuring the reliability and performance of our data systems through Azure monitoring and alerts. Your analytical skills will be essential in uncovering actionable insights from complex datasets, and you will work closely with cross-functional teams to support data-driven decision-making.

Key Responsibilities
Data Collection and Management:
- Extract, transform, and load (ETL) data from various sources into our online applications using tools and processes.
- Utilize APIs to automate data integration from diverse platforms.
- Maintain and enhance existing data pipelines, ensuring data integrity and consistency.
Data Analysis:
- Conduct in-depth data analysis to uncover trends, patterns, and actionable insights.
- Utilize SQL for querying, managing, and manipulating large datasets.
- Create and maintain interactive dashboards and reports to present data insights to stakeholders.
Monitoring and Alerts:
- Implement and manage Azure monitoring and alerting systems to ensure data workflows and applications are functioning optimally.
- Proactively identify and troubleshoot issues in data processes, ensuring minimal downtime and maximum reliability.
Collaboration and Communication:
- Collaborate with cross-functional teams including marketing, product development, and IT to understand data needs and provide analytical support.
- Communicate complex data findings and recommendations to both technical and non-technical audiences.
- Contribute to continuous improvement of data processes, analytical methodologies, and best practices.

Critical Competencies for Success
Educational Background: Bachelor's degree in Data Science, Statistics, Computer Science, or a related field.
Technical Skills:
- Proficient in SQL for data querying and manipulation.
- Experience with ETL processes and tools.
- Strong understanding of API integration and data automation.
- Hands-on experience with Azure monitoring and alerting tools.
- Knowledge of programming languages such as HTML and JavaScript is a plus.
Experience:
- Proven experience in a data analyst role or similar position.
- Demonstrated experience with online data sources and web analytics.
- Experience with cloud platforms, particularly Azure, is required.
Analytical Skills:
- Strong problem-solving skills and attention to detail.
- Ability to analyze large datasets and generate meaningful insights.
- Excellent statistical and analytical capabilities.
Soft Skills:
- Strong communication and presentation skills.
- Ability to work independently and collaboratively in a team environment.
- Good organizational skills and ability to manage multiple projects simultaneously.
Preferred candidate profile: Bachelor's degree in Data Science, Statistics, Computer Science, or a related field.
Perks and benefits: Industry standards
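The API-based ETL described above follows a simple shape: call the endpoint, validate the response, upsert into a store. A minimal, hypothetical sketch with the requests library and SQLite follows; the URL, parameters, and record fields are invented for the example.

```python
import sqlite3
import requests  # pip install requests

# Hypothetical endpoint; any JSON-returning analytics API follows this shape.
URL = "https://api.example.com/v1/metrics"

resp = requests.get(URL, params={"from": "2024-01-01"}, timeout=30)
resp.raise_for_status()               # fail loudly on HTTP errors
records = resp.json()                 # assume a list of {"day": ..., "visits": ...}

conn = sqlite3.connect("analytics.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS metrics (day TEXT PRIMARY KEY, visits INTEGER)"
)
# Upsert so re-running the extract for the same window is idempotent.
conn.executemany(
    "INSERT OR REPLACE INTO metrics VALUES (:day, :visits)",
    records,
)
conn.commit()
```

In the Azure setting the posting describes, the same job would typically run on a schedule with a monitoring alert on failures rather than being invoked by hand.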
Posted 1 week ago
3.0 - 6.0 years
7 - 15 Lacs
Gurugram
Work from Office
Dear Candidate, Greetings!!
Hiring for: SSIS Developer - Gurgaon (work from office)
Responsibilities:
1. Must have experience building SSIS packages for ETL processes
2. End-to-end data migration
3. Must have experience in Oracle Cloud
Share resume at abhishek@xinoe.com
Regards,
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree
Preferred education: High School Diploma/GED

Required technical and professional expertise:
- Good hands-on experience in DBT is required; ETL (DataStage) and Snowflake preferred.
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer.
- Ability to use extract, transform, and load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
Posted 1 week ago
15.0 - 20.0 years
17 - 22 Lacs
Chennai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Experience with data integration and ETL tools.
- Strong understanding of data modeling and database design principles.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have minimum 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Chennai.
- A 15 years full time education is required.
Posted 1 week ago
15.0 - 20.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: PySpark
Good to have skills: NA
Minimum 5 years of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in PySpark.
- Strong understanding of data modeling and database design principles.
- Experience with data warehousing solutions and ETL tools.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of data governance and data quality frameworks.

Additional Information:
- The candidate should have minimum 5 years of experience in PySpark.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
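The extract-transform-load cycle this role centers on has a compact expression in PySpark. A minimal, hypothetical sketch follows; the file paths, column names, and aggregation are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read a raw CSV drop (path is a placeholder).
raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: type the amount column, then aggregate per day.
daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .groupBy("order_date")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"))
)

# Load: write partitioned Parquet for downstream consumers.
daily.write.mode("overwrite").partitionBy("order_date") \
     .parquet("/data/curated/daily_orders")
```

On Databricks (the adjacent posting's required skill) the same code runs unchanged, with the cluster providing the SparkSession.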
Posted 1 week ago