6.0 - 10.0 years
0 Lacs
Karnataka
On-site
An Enterprise IT Service Management company based in Basavanagudi, Bangalore, is looking for a skilled PL/SQL and MySQL Lead Developer to join its dynamic team. In this role, you will design, develop, and maintain complex database systems using PL/SQL and MySQL, with a focus on ensuring efficient database performance and providing support for database-related issues.

Your key responsibilities will include database development tasks such as designing, developing, and maintaining PL/SQL packages, procedures, functions, and triggers. You will also write and optimize MySQL queries for applications and develop and maintain complex stored procedures and functions, ensuring that database code meets organizational standards and best practices for performance, security, and maintainability.

On the database management side, you will perform database tuning and performance monitoring, implement data migration and ETL processes between systems, conduct regular backups and data recovery exercises, and monitor database systems to ensure secure services with minimal downtime.

Collaboration will be key in this role: you will work closely with application developers to ensure database integration with applications, partner with IT and development teams to resolve complex issues, and provide database support for issues related to database design, functionality, and performance. You will also create and maintain documentation of database designs, configurations, and user instructions, and document and enforce database standards and procedures to ensure consistency and efficiency across the organization.
To be successful in this role, you should have proven experience as a PL/SQL and MySQL developer, along with experience in database design, optimization, and performance tuning. Proficiency in PL/SQL development, including writing complex stored procedures, functions, and triggers, and strong experience with MySQL query optimization and database performance tuning are required. Knowledge of data warehousing and ETL processes, familiarity with other databases such as Oracle or SQL Server, experience with database design and modeling tools, and an understanding of indexing, partitioning, and other database optimization techniques will be advantageous. A Bachelor's degree in Computer Science, Information Technology, or a related field is required, although equivalent experience may be considered. Preference will be given to candidates based in Bangalore. If you want to contribute your expertise in PL/SQL and MySQL development and ensure the efficient performance of complex database systems, this role may be the perfect fit for you. Contact Rajina Sathiraj at +91 8904881858 or email rajina@sapienceminds.com for further details.
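The query optimization and indexing work described above typically starts with reading the engine's query plan. As an illustrative sketch (using Python's built-in sqlite3 as a stand-in for MySQL, since it ships with Python; the table and column names are hypothetical), adding an index turns a full table scan into an index search:

```python
import sqlite3

# In-memory database as a stand-in for a real MySQL instance (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (id INTEGER PRIMARY KEY, status TEXT, assignee TEXT)")
conn.executemany(
    "INSERT INTO tickets (status, assignee) VALUES (?, ?)",
    [("open" if i % 3 else "closed", f"user{i % 50}") for i in range(1000)],
)

def plan(sql):
    """Return the detail column of the first step of SQLite's query plan."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT id FROM tickets WHERE status = 'open'"
before = plan(query)  # no index on status yet: the planner scans the table
conn.execute("CREATE INDEX idx_tickets_status ON tickets (status)")
after = plan(query)   # now the planner can search via the index

print(before)
print(after)
```

The same discipline applies in MySQL (`EXPLAIN SELECT ...`), where the goal is likewise to replace full scans on hot queries with index lookups.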
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
The ideal candidate for this role will work independently within the team on a range of technical topics. You will support the decomposition of customer requests into detailed stories in collaboration with the Product Owner, and develop code using Software Craftsmanship best practices such as behavior-driven development, test-driven development, continuous integration, legacy refactoring, continuous delivery, and continuous deployment.

You should have a strong background in Talend data integration and MicroStrategy Analytics, with 3-7 years of experience. Proficiency in SQL queries and RDBMS performance tuning for Oracle and PostgreSQL is essential, as are a deep understanding of ETL processes and the ability to design, define, and document solution architecture. Awareness of Cognos 11.x or higher is advantageous. Experience in gathering client requirements, working through specifications, and developing solutions in line with project documentation is crucial, and the ability to manage large projects independently, knowledge of Business Analysis (particularly in Retail and Private banking), and hands-on experience with agile processes are highly desirable.

Joining Societe Generale will give you the opportunity to be part of a team that believes in the transformative power of people. Whether you are looking for a short-term commitment or a long-term career, you can contribute to shaping the future. Our culture encourages creativity, innovation, and action, offering a supportive and stimulating environment for your professional growth. As part of our commitment to social responsibility, employees can dedicate working hours to solidarity actions, including sponsoring individuals facing challenges in their professional journey, contributing to the financial education of young apprentices, and sharing skills with charitable organizations. If you are passionate about making a difference and want to work in a dynamic and inclusive environment, this role is the perfect fit for you.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Delhi
On-site
As a Snowflake DBT Lead at Pyramid Consulting, you will oversee Snowflake data transformation and validation processes in Delhi, India. Your role includes ensuring efficient data handling, maintaining data quality, and collaborating closely with cross-functional teams.

To excel in this role, you should have strong expertise in Snowflake, DBT, and SQL; experience in data transformation, modeling, and validation; and proficiency in ETL processes and data warehousing. Excellent problem-solving and communication skills will enable you to address challenges effectively and work seamlessly with team members. You should hold a Bachelor's degree in Computer Science or a related field, and your ability to lead and collaborate within a team will be key to delivering high-quality solutions and driving impactful results for our clients.
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
Haryana
On-site
You will work as an Azure Data Engineer with Zorba AI, a modern analytics provider based in Mumbai. Your primary role involves developing and maintaining data pipelines, optimizing data structures, and ensuring data security. This contract role is fully remote, and you will collaborate with cross-functional teams to implement Azure-based data solutions.

To excel in this role, you should be proficient in Azure cloud services and tools and have experience in data engineering, ETL processes, and data warehousing. Strong knowledge of SQL, Python, and data modeling is essential, as is the ability to optimize data pipelines for performance and scalability. Problem-solving skills, analytical ability, effective communication, and teamwork are also crucial. The ideal candidate will have a Bachelor's degree in Computer Science or a related field and at least 4 years of experience in Azure data engineering. If you are passionate about leveraging data-driven solutions to drive growth and success for businesses, this role at Zorba AI could be the perfect fit for you.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Senior Software Engineer at Clarivate, you will be an integral part of our Technology team, focusing on data engineering, ETL, and script writing to create efficient data pipelines for data cleansing and structuring.

To excel in this role, you should hold a Bachelor's degree in Computer Science or have equivalent experience, along with a minimum of 3 years of programming experience and a strong grasp of SQL. Experience with ETL processes, APIs, data integration, system analysis, and design is highly valuable. You should be adept at implementing data validation and cleansing processes to maintain data integrity, and proficient in pattern matching, regular expressions, XML, JSON, and other textual formats.

In this position, you will analyze textual and binary patent data, using regular expressions to extract data patterns. Key responsibilities include writing clean, efficient, and maintainable code according to coding standards, automating tests, and unit testing all assigned tasks. You will collaborate closely with the Content Analysts team to design and implement mapping rules for data extraction from various file formats, and liaise with cross-functional teams to understand data requirements and provide technical support. Ideally, you would also have experience with cloud-based data storage and processing solutions such as AWS, Azure, or Google Cloud, and a strong understanding of code versioning tools such as Git.

At Clarivate, you will join the Data Engineering team, collaborating with multiple squads of Data Engineers, Testers, Leads, and Release Managers to process and deliver high-quality patent data from diverse input source formats. This permanent position offers a hybrid work model with 9 hours of work per day, including a lunch break, in a flexible and employee-friendly environment.

Clarivate is dedicated to promoting equal employment opportunities for all individuals in hiring, compensation, promotion, training, and other employment privileges. We adhere to applicable laws and regulations to ensure non-discrimination in all locations.
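The regular-expression extraction described above might look like the following sketch. The pattern is hypothetical (it matches one US-style publication-number layout for illustration; real patent data spans many country-specific formats), but it shows the general shape of pulling structured fields out of free text:

```python
import re

# Hypothetical pattern for US-style publication numbers such as "US 10,123,456 B2".
# Real patent corpora use many country-specific formats; this is only a sketch.
PATENT_RE = re.compile(r"\bUS\s?(\d{1,2},\d{3},\d{3})\s?([AB]\d)\b")

def extract_patents(text: str) -> list[tuple[str, str]]:
    """Return (publication number, kind code) pairs found in free text."""
    return PATENT_RE.findall(text)

sample = "Cited in US 10,123,456 B2 and earlier in US 9,876,543 A1."
print(extract_patents(sample))  # [('10,123,456', 'B2'), ('9,876,543', 'A1')]
```

In practice each input format would get its own mapping rule (regex or XML/JSON parser), with the extracted fields validated before loading downstream.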
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As an experienced Oracle FCCS (Financial Consolidation and Close Cloud Service) Implementation Consultant, you will lead the design, deployment, and optimization of Oracle FCCS solutions for financial consolidation, intercompany eliminations, currency translation, and financial close processes. Your expertise in consolidation accounting, statutory reporting, GAAP/IFRS compliance, financial close automation, and data integration with ERP systems will be crucial to ensuring smooth consolidation and reporting cycles.

Your key responsibilities will include:
- Leading end-to-end implementation of Oracle FCCS for financial consolidation and close processes.
- Configuring FCCS dimensions, metadata, security, and consolidation rules based on business requirements.
- Developing intercompany elimination rules, ownership structures, and multi-currency translation logic.
- Customizing forms, dashboards, task lists, and Smart View reports for financial users.
- Working closely with finance and accounting teams to optimize month-end and quarter-end close cycles.
- Ensuring GAAP, IFRS, and statutory compliance in financial reporting and consolidation.
- Configuring Data Management (DM/FDMEE) for data integration from ERP systems (Oracle Cloud, SAP, Workday, etc.).
- Developing and optimizing business rules, calculation scripts, and Groovy scripts for complex consolidation logic.
- Conducting end-user training sessions for finance, accounting, and audit teams.
- Collaborating with cross-functional teams to integrate FCCS with other EPM applications (EPBCS, ARCS, EDMCS).

To be successful in this role, you should have a Bachelor's degree in Finance, Accounting, Business, Information Systems, or a related field, along with 3 to 6 years of hands-on experience in Oracle FCCS implementation and consolidation accounting. An Oracle FCCS Certification, CPA, CA, or equivalent accounting certification would be advantageous.

Your technical skills should include proficiency in Smart View, Data Management (DM/FDMEE), and Essbase cube optimization, as well as experience with REST/SOAP APIs, SQL, and ETL tools for data integration. Strong communication, problem-solving, and stakeholder management skills are essential for effective collaboration with finance and IT teams. If you are self-motivated, able to manage multiple projects in a fast-paced environment, and have exposure to project management methodologies (Agile, Scrum, or Waterfall), we encourage you to join our team. Your contributions will play a key role in delivering innovative Oracle solutions that maximize operational excellence and benefits for our clients.
Posted 2 months ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
Optum is a global organization dedicated to delivering care, using technology to improve the lives of millions of people. Your work with our team will directly enhance health outcomes by providing individuals with access to care, pharmacy benefits, data, and the resources necessary for their well-being. Our culture is defined by diversity and inclusion, talented colleagues, comprehensive benefits, and opportunities for career development. Join us in making a positive impact on the communities we serve while advancing global health equity through caring, connecting, and growing together.

In this role, your primary responsibilities will include analyzing client requirements and complex business scenarios, designing innovative and fully automated products and solutions, serving as a BI Developer on key projects, ensuring high-quality execution, providing consulting to teammates, leaders, and clients, and offering extensive solutions in ETL strategy.

You should have an undergraduate degree or equivalent experience, along with expertise in ETL processes and data integration using Azure Data Factory. Proficiency in Power BI semantic model creation, report development, and data visualization is required, with Snowflake and Azure Data Warehouse as primary data sources. You should also have a strong understanding of data modeling concepts and relational database systems. Familiarity with Databricks for data engineering, advanced analytics, and machine learning tasks is preferred, as is proficiency in Azure Cloud services such as Azure Data Factory, Azure SQL Data Warehouse, Azure Data Lake Storage, and Azure Analytics. Solid programming skills in Python, SQL, and other scripting languages are essential, along with proven problem-solving ability, effective communication and collaboration skills, and the capacity to manage multiple tasks simultaneously.

Microsoft certifications in Power BI, Azure Cloud, Snowflake, or related fields are a plus. The role is based in Hyderabad, Telangana, IN.
Posted 2 months ago
9.0 - 13.0 years
0 Lacs
Karnataka
On-site
As a Principal Software Engineer at Autodesk within the Growth Experience Technology (GET) organization, you will be part of the Enterprise System and Application Engineering team. The team is structured by product domain to ensure alignment across team members and to drive growth, platform evolution, efficiency, and scalability.

Your responsibilities will include collaborating with stakeholders to gather and analyze complex requirements, providing expert guidance on Salesforce solutions, and aligning them with strategic goals. You will lead design and architecture efforts, write advanced code using Salesforce technologies, and ensure code quality and maintainability. Additionally, you will lead customization and integration projects, develop data solutions, implement security measures, define testing strategies, and oversee deployments.

To be successful in this role, you will need a Bachelor's degree in Computer Science or a related field, a Salesforce Developer certification, and at least 9 years of experience as a Salesforce Developer. Proficiency in Apex, Visualforce, Lightning Components, and declarative development is required, along with experience in Sales Cloud, sales processes, integrations, and data management.

As a Principal Software Engineer, you will contribute to strategic planning for Salesforce programs, collaborate with global teams, coach junior developers, lead training sessions, and manage relationships with third-party vendors. Your role will also involve ensuring legal and regulatory compliance, optimizing performance, and staying current on Salesforce technology trends. Autodesk values diversity and belonging, and offers a competitive compensation package based on experience and location. Join Autodesk to be part of a culture that encourages authenticity, meaningful work, and building a better future for all.
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
As an Azure Data Factory Engineer at Aspire Systems, you will design, develop, and maintain robust data pipelines and ETL processes, and implement and optimize data storage solutions in data warehouses and data lakes. You should have strong experience with Microsoft Azure tools, including SQL Azure, Azure Data Factory, Azure Databricks, and Azure Data Lake, coupled with excellent communication skills.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes.
- Implement and optimize data storage solutions in data warehouses and data lakes.
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions.
- Utilize Microsoft Azure tools for data integration, transformation, and analysis.
- Develop and maintain reports and dashboards using Power BI and other analytics tools.
- Ensure data integrity, consistency, and security across all data systems.
- Optimize database and query performance to support data-driven decision-making.

Qualifications:
- 7-10 years of professional experience in data engineering or a related field.
- Profound expertise in SQL, T-SQL, database design, and data warehousing principles.
- Strong experience with Microsoft Azure tools, including SQL Azure, Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Proficiency in Python, PySpark, and PySQL for data processing and analytics tasks.
- Experience with Power BI and other reporting and analytics tools.
- Demonstrated knowledge of OLAP, data warehouse design concepts, and performance optimization in database and query processing.
- Excellent problem-solving, analytical, and communication skills.

Join Aspire Systems, a global technology services firm that serves as a trusted technology partner for over 275 customers worldwide. Aspire collaborates with leading enterprises in Banking, Insurance, Retail, and ISVs, helping them leverage technology to thrive in the digital era. With a focus on Software Engineering & Digital Technologies, Aspire enables companies to operate smart business models. The company's core philosophy of "Attention. Always." reflects its commitment to providing exceptional care and attention to customers and employees. Aspire Systems is CMMI Level 3 certified and has a global workforce of over 4,900 employees operating across North America, LATAM, Europe, the Middle East, and Asia Pacific. Aspire has been recognized as one of the Top 100 Best Companies to Work For by the Great Place to Work Institute for 12 consecutive years. Explore more about Aspire Systems at https://www.aspiresys.com/.
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have over 5 years of experience working with SQL Server Integration Services (SSIS) or SQL Server Analysis Services (SSAS), along with strong knowledge of T-SQL and SQL Server-based development tools. Your role will involve handling ETL processes, data warehousing, and OLAP cube development. Familiarity with data modeling and report development using tools such as SSRS and Power BI is essential, and you should be proficient in troubleshooting and optimizing SQL queries and SSIS packages. To be eligible for this position, you must have more than 4 years of relevant experience, and SSIS/SSAS skills are mandatory.
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As an experienced Business Intelligence (BI) Developer at Active Bean Counter Private Limited, your primary responsibility will be to design, develop, and maintain enterprise-level BI solutions using Power BI, SQL, and Azure. You will play a crucial role in integrating Salesforce data, optimizing performance, and providing actionable insights to support data-driven decision-making.

Your key responsibilities will include:
- Dashboard and Report Development: Create advanced Power BI dashboards, DAX calculations, reports, and data models that deliver critical business insights and meet user requirements.
- SQL and Data Optimization: Write and optimize complex SQL queries, stored procedures, and views for data extraction and transformation, and apply performance tuning techniques to improve query efficiency.
- Azure Data Services: Use Azure Data Services such as Azure SQL and Azure Data Factory to build and manage cloud-based BI solutions, and automate data refresh schedules and ETL pipelines using Azure tools and Power BI Service.
- Salesforce Integration: Integrate Salesforce data and reports into Power BI for comprehensive analytics, ensuring data consistency and accuracy when merging multiple data sources.
- Security and Governance: Develop and implement row-level security (RLS), governance frameworks, and performance optimization in Power BI, upholding data security standards and best practices for data governance.
- Collaboration and Business Requirements: Work closely with cross-functional teams to gather business requirements and translate them into scalable BI solutions, providing technical leadership, mentorship, and guidance to junior developers and business users.
- Continuous Improvement: Stay current with the latest trends in Power BI, Azure, and analytics, and continuously enhance reporting solutions by incorporating best practices and new technologies.

To excel in this role, you should possess:
- 7+ years of experience in Power BI development or similar analytics tools.
- Strong expertise in SQL development, including query optimization, stored procedures, and performance tuning.
- Hands-on experience with Azure Data Services such as Azure SQL and Data Factory.
- Familiarity with Salesforce reporting and data integration in Power BI.
- Strong knowledge of data modeling concepts such as Star Schema, Snowflake Schema, and ETL processes.
- Experience implementing security and governance in Power BI environments.
- The ability to interpret business requirements and translate them into actionable insights.
- Strong analytical skills, with the capability to communicate complex data in a clear, business-friendly manner.

Preferred qualifications include experience mentoring and providing technical leadership to junior developers, as well as expertise in automating data pipelines and managing large datasets in Azure environments.
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
Inviting applications for the role of Manager - Global Event Tech Manager.

Location: Gurugram

We welcome talent that will constantly push the boundaries for newer and better ways of doing things. The Global Event Tech Manager is a pivotal role in helping Genpact take growth to the next level through global marketing planning and execution designed to drive results. In this role, you will be responsible for the management and execution of in-person events (hosted and sponsored) as well as virtual events on different platforms. You must possess strong marketing, communications, and program management skills, with the ability to work seamlessly with peers across the marketing function and collaborate with colleagues to achieve common growth ambitions.

Responsibilities

Event logistics:
- Work with external vendors and suppliers to ensure timely and cost-effective event execution.
- Initiate and oversee the sourcing process on MSA and SOW, including PR/PO (purchase requests).
- Ensure brand compliance for all events and event materials.
- Assist with event administration, invoicing, and logistical planning and management.
- Support the acquisition of branded merchandise through the online store and manage global orders from briefing to delivery.

Event platform management:
- Build and manage the project plan to support Program Owner priorities and objectives, taking responsibility for delivery management across all workstreams related to Genpact's events experience platforms.
- Oversee the day-to-day administration of Cvent, handling all queries related to platform setup, management, tracking, and measurement.
- Produce regular engagement reporting from the available dashboards, plus other reports on request.
- Support the preparation of outcome communications to various stakeholders through regular report-outs.
- Manage content on the existing interface, including adding and removing event-related content and imagery, partnering with relevant teams and GStudios.
- Maintain guides and templates for customizing content, and coordinate with the event team and GTM leaders on content changes in different environments.
- Coordinate new deployments and work on process requirements.
- Manage user access, permissions, and training to ensure optimal use of Cvent across the organization.
- Develop and maintain Cvent best practices, guidelines, and SOPs, and make recommendations on the user experience based on best practices and performance.

Salesforce and cross-tech integration:
- Design, develop, and maintain integrations between Cvent and Salesforce to ensure seamless data flow and synchronization.
- Work with IT or third-party developers to integrate Cvent with other systems (e.g., CRM, marketing tools).
- Troubleshoot and resolve integration issues promptly to minimize downtime and data discrepancies.
- Collaborate with the IT team to ensure secure and efficient data handling practices.

Project management:
- Lead projects related to Cvent implementations, upgrades, and integrations with Salesforce.
- Coordinate with cross-functional teams to gather requirements, plan, and execute projects.
- Manage project timelines, resources, and deliverables, ensuring projects are completed on time and within budget.
- Demonstrate strong organizational skills, with the ability to manage multiple events simultaneously.
- Communicate clearly, verbally and in writing, when coordinating with vendors, attendees, and internal teams, and present event plans and outcomes to stakeholders.

Qualifications we seek in you!

Minimum qualifications / skills:
- Years of experience with Cvent and relevant event management platforms, with at least years in an administrative or managerial role.
- Certifications in Cvent (Cvent Certified Event Manager) and as a Salesforce consultant (e.g., Salesforce Certified Administrator, Platform App Builder).
- Proven experience with Salesforce, including integration experience, preferably with Apex, Salesforce APIs, or middleware tools.
- Application integration experience with years experience.
- Strong understanding of data integration principles, ETL processes, and data warehousing.
- Knowledge of event management processes and best practices.
- Strong analytical and problem-solving skills.
- Excellent communication and executive presence to connect at C-level.
- Creative, resourceful, and takes initiative.
- Strong project management skills.
- Demonstrated ability to drive change and to effectively influence and motivate others.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Location: Pune/Nagpur. Immediate joiners only.

Job Description

Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes using Snowflake and other cloud technologies.
- Work with large datasets, ensuring their availability, quality, and performance across systems.
- Implement data models and optimize storage and query performance on Snowflake.
- Write complex SQL queries for data extraction, transformation, and reporting purposes.
- Develop, test, and implement data solutions leveraging Python scripting and Snowflake's native features.
- Collaborate with data scientists, analysts, and business stakeholders to deliver scalable data solutions.
- Monitor and troubleshoot data pipelines to ensure smooth operation and efficiency.
- Perform data migration, integration, and processing tasks across cloud platforms.
- Stay updated with the latest developments in Snowflake, SQL, and cloud technologies.

Required Skills:
- Snowflake: expertise in building, optimizing, and managing data warehousing solutions on Snowflake.
- SQL: strong knowledge of SQL for querying and managing relational databases, writing complex queries and stored procedures, and performance tuning.
- Python: proficiency in Python for scripting, automation, and integration within data pipelines.
- Experience developing and managing ETL processes and ensuring data accuracy and performance.
- Hands-on experience with data migration and integration processes across cloud platforms.
- Familiarity with data security and governance best practices.
- Strong problem-solving skills, with the ability to troubleshoot and resolve data-related issues.

(ref:hirist.tech)
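The "Python scripting within data pipelines" responsibility above usually means a transform stage between extract and load. A minimal, self-contained sketch follows; the warehouse connection itself is omitted (a real pipeline would use a Snowflake connector for extract/load), and the field names and cleansing rules are hypothetical:

```python
from datetime import datetime

# Transform stage of a hypothetical ETL pipeline: validate and normalize raw rows
# before they are loaded into the warehouse. Data is inlined so the sketch runs
# on its own; in production the rows would come from a source system.

def transform(rows):
    """Cleanse raw rows: drop records missing an id, normalize dates, cast amounts."""
    cleaned = []
    for row in rows:
        if not row.get("order_id"):  # validation: reject incomplete records
            continue
        cleaned.append({
            "order_id": str(row["order_id"]).strip(),
            "order_date": datetime.strptime(row["order_date"], "%Y-%m-%d").date(),
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

raw = [
    {"order_id": " A-1001 ", "order_date": "2024-03-05", "amount": "19.990"},
    {"order_id": None,       "order_date": "2024-03-06", "amount": "5.00"},
]
print(transform(raw))
```

Keeping the transform a pure function of its input rows, as here, makes the cleansing logic easy to unit test independently of the warehouse.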
Posted 2 months ago
10.0 - 17.0 years
0 Lacs
Hyderabad, Telangana
On-site
We have an exciting opportunity for an ETL Data Architect with an AI/ML-driven SaaS product company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust data access layer that provides consistent data access over the underlying heterogeneous storage layer. You will also develop and enforce data governance policies to ensure data security, quality, and compliance across all systems.

In this role, you will lead the architecture and design of data solutions that leverage a modern tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. You will also oversee performance optimization, scalability, and reliability of data systems, while guiding and mentoring team members on data architecture, design, and problem-solving.

The ideal candidate has at least 10 years of experience in data-related roles, including a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure, and a deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as with other platforms such as MongoDB and Snowflake, is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus, and proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must.

Excellent communication skills are crucial, including the ability to translate complex technical concepts for non-technical stakeholders, along with proven leadership experience in team management and cross-functional collaboration. A Bachelor's degree in Computer Science, Information Systems, or a related field is required, with a Master's degree preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. You will be expected to stay current on emerging trends in data technology, particularly AI/ML applications for finance.

Industry: IT Services and IT Consulting
Posted 2 months ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onward, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Senior Principal Consultant - Data Modeler! We are seeking a meticulous and experienced Data Modeler to join our dynamic team and contribute to the enhancement of our data infrastructure. This pivotal role centers on creating detailed data models that effectively capture and organize our enterprise data in ways that support critical business functions and decision-making processes. The ideal candidate will possess a deep understanding of data architecture principles, extensive experience with data modeling tools, and a proven track record in designing scalable and efficient data structures. As a Data Modeler, you will collaborate closely with data architects and business analysts to ensure that our data strategies are optimally aligned with our business goals, driving innovations and improvements in data usability, quality, and efficiency. 
Roles and Responsibilities: Develop comprehensive conceptual, logical, and physical data models that meet business requirements. Work closely with stakeholders to understand business processes and gather data requirements. Implement data standards and common data structures to improve data quality and accessibility. Utilize advanced modeling tools to create and maintain data schemas and dictionaries. Ensure data models are integrated with existing data architectures and extend support to data engineering and data analytics teams. Conduct data model reviews with cross-functional team members to ensure quality and functionality. Manage metadata to ensure consistency and relevance across all data models. Optimize and update data models to accommodate new requirements and changes in business processes. Lead initiatives for data normalization, data enrichment, and other forms of data transformation to enhance data utility and efficiency. Develop best practices for data modeling and schema design, mentoring junior data modelers and promoting a culture of quality and innovation in data handling. Experience on GenAI projects is desirable. Qualifications we seek in you! Minimum qualifications: Experience in data modeling or a related field. Expertise in modeling tools such as ERwin, Enterprise Architect, or Microsoft SQL Server Data Tools. Strong understanding of relational and non-relational database systems. Experience with data warehousing and ETL processes. Proficiency in SQL and familiarity with scripting languages such as Python or R. Demonstrated ability to translate business needs into data models that support enterprise information systems. Knowledge of data management regulations and compliance requirements. Ability to work effectively in a collaborative team environment. Strong analytical and problem-solving skills. Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field. 
Master's degree or professional certification in data management or architecture is preferred. Demonstrated experience in managing large datasets and complex data structures. Proven track record of successful project execution with tangible improvements in data architecture. Why join Genpact? Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
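A core responsibility named above is translating a logical data model into a normalized physical schema. As a minimal illustrative sketch only (using SQLite as a stand-in database; the table names and data are invented, not taken from the posting), a customer/order model with a foreign key keeps the data normalized and lets the engine reject orphaned rows:

```python
import sqlite3

# Hypothetical two-table model: orders reference customers via a foreign key.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE customer_order (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    total       REAL NOT NULL
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO customer_order VALUES (10, 1, 99.0)")  # valid reference

# A row pointing at a non-existent customer violates the model.
try:
    conn.execute("INSERT INTO customer_order VALUES (11, 42, 5.0)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
print(rejected)
```

The same separation of entities applies regardless of the target engine; only the DDL dialect changes.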
Posted 2 months ago
0.0 years
2 - 5 Lacs
Rohtak, Haryana, India
On-site
Any candidate who wants to apply can contact the given number: 08375858125. Data Entry Tasks: Accurately input and update data in company databases and systems. Verify the accuracy of information and resolve any discrepancies. Organize and manage data files to ensure easy retrieval and access. Perform routine quality checks to ensure data integrity. Backend Operations: Assist in preparing reports and handling correspondence. Coordinate with various departments to ensure smooth operations. Maintain confidentiality of sensitive information. Perform other administrative tasks as assigned. Qualifications: Minimum educational qualification: 12th pass or equivalent. Basic computer proficiency (MS Office, email, etc.). Good communication skills. Attention to detail and a willingness to learn.
Posted 2 months ago
4.0 - 9.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Notice Period: Immediate - 15 days. Job Summary: As a Software Engineer / Senior Software Engineer (Database), you will play a pivotal role in designing, developing, and maintaining the database infrastructure for our core product. You will collaborate closely with the development team to ensure that our database solutions are scalable, efficient, and aligned with our business objectives. Key Responsibilities: 1. Database Design and Development: - Develop and implement database models, views, tables, stored procedures, and functions to support product development. - Design and maintain SSIS packages, T-SQL scripts, and SQL jobs. - Optimize database performance through query tuning, indexing, and partitioning. 2. Data Integration: - Develop complex stored procedures for loading data into staging tables from various sources. - Ensure data integrity and consistency across different systems. 3. Data Analytics: - Collaborate with data analysts to design and implement data analytics solutions using tools like SQL Server, SSIS, SSRS, and Excel Power Pivot/View/Map. 4. Documentation: - Document complex processes, business requirements, and specifications. 5. Database Administration: - Provide authentication and authorization for database access. - Develop and enforce best practices for database design and development. - Manage database migration activities. Required Skills: Technical Skills: - Strong proficiency in MS SQL Server (query tuning, stored procedures, functions, views, triggers, indexes, columnstore indexes, and query execution plans). - Experience with database design, normalization, and performance optimization. - Knowledge of data warehousing and ETL processes. - Experience with SSIS, SSRS, and Excel Power Pivot/View/Map. Soft Skills: - Excellent analytical, problem-solving, and communication skills. - Ability to work independently and as part of a team. - Attention to detail and commitment to quality.
Posted 2 months ago
7.0 - 10.0 years
3 - 15 Lacs
Chennai, Tamil Nadu, India
On-site
Your responsibilities Collaborate with business units, IT teams, and other stakeholders to understand data needs and establish governance requirements. Lead and improve data governance practices, ensuring that FLS's data is organized, governed, and used to drive impactful business transformation. Provide expert guidance on defining and implementing data standards, quality metrics, and governance frameworks. Track and report on master data governance progress, ensuring measurable outcomes and continuous improvement. Stay ahead of industry trends and best practices in master data management and data governance. Establish data policies and procedures, defining and documenting governance policies and procedures. Drive cross-functional data forums for all data domains. What you bring Strong understanding of data governance principles, best practices, and data quality management. Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams. Ability to analyze complex data governance issues, identify root causes, and propose solutions to improve processes and mitigate risks. Attention to detail in ensuring data accuracy and compliance with governance standards. Ability to thrive in a fast-paced, dynamic environment and manage multiple priorities proactively. Experience with Microsoft Dynamics (CRM and ERP) would be an advantage. A master's degree or equivalent in IT, Data Management, Business Economics, or a related field. Fluency in English, as you will be joining an international team working across borders.
Posted 2 months ago
5.0 - 15.0 years
5 - 18 Lacs
Kolkata, West Bengal, India
On-site
Description We are seeking a skilled Data Scientist with expertise in Machine Learning to join our dynamic team in India. The ideal candidate will have a strong background in developing predictive models and a passion for extracting insights from complex data. This role will involve collaboration with various teams to drive data-driven decision-making. Responsibilities Develop and implement machine learning models to solve business problems. Analyze large datasets to extract meaningful insights and trends. Collaborate with cross-functional teams to define data-driven strategies. Communicate findings and recommendations to stakeholders effectively. Continuously improve models based on feedback and new data. Stay updated with the latest advancements in data science and machine learning. Skills and Qualifications Strong understanding of machine learning algorithms and techniques. Proficiency in programming languages such as Python, R, or Java. Experience with data manipulation and analysis libraries (e.g., Pandas, NumPy). Familiarity with machine learning frameworks (e.g., TensorFlow, Keras, Scikit-learn). Solid understanding of statistics and probability theory. Experience with data visualization tools (e.g., Matplotlib, Seaborn, Tableau). Ability to work with SQL and NoSQL databases. Strong problem-solving skills and analytical mindset.
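The core of the role above is building predictive models from data. As a minimal, self-contained sketch (the data, labels, and distance metric are illustrative inventions, not anything from the posting), a 1-nearest-neighbour classifier in pure Python shows the basic shape of a predictive model: fit to labelled points, then predict the label of a new point from its closest neighbour.

```python
import math

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def predict(train_points, train_labels, query):
    """Label a query point with the label of its nearest training point."""
    best = min(range(len(train_points)),
               key=lambda i: euclidean(train_points[i], query))
    return train_labels[best]

# Two invented clusters of 2-D points with string labels.
points = [(1.0, 1.0), (1.2, 0.8), (8.0, 9.0), (9.1, 8.7)]
labels = ["low", "low", "high", "high"]

print(predict(points, labels, (1.1, 0.9)))  # near the "low" cluster
print(predict(points, labels, (8.5, 9.0)))  # near the "high" cluster
```

Production work would reach for the libraries the posting names (scikit-learn, TensorFlow), but the fit/predict contract those libraries expose follows this same pattern.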
Posted 2 months ago
5.0 - 15.0 years
5 - 20 Lacs
Gurgaon, Haryana, India
On-site
Description We are seeking a Data Analyst with 5-15 years of experience to join our team in India. The ideal candidate will have a strong background in data analysis, a passion for uncovering insights, and the ability to communicate findings effectively. Responsibilities Collecting and analyzing large sets of data to identify trends and patterns. Creating data visualizations and reports to communicate findings to stakeholders. Collaborating with cross-functional teams to understand data needs and deliver actionable insights. Developing and maintaining databases and data systems to ensure data integrity. Performing statistical analysis to support decision-making processes. Skills and Qualifications Bachelor's degree in Data Science, Statistics, Mathematics, Computer Science, or related field. Proficiency in SQL for data querying and manipulation. Experience with data visualization tools such as Tableau, Power BI, or similar. Strong analytical skills with the ability to interpret complex data sets. Familiarity with programming languages such as Python or R for data analysis. Knowledge of statistical methods and techniques for data analysis.
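The posting above asks for SQL proficiency for querying and trend analysis. A minimal sketch, using Python's built-in sqlite3 module with an invented sales table (the schema and numbers are illustrative only), shows the typical aggregate-and-rank query an analyst would write:

```python
import sqlite3

# Hypothetical monthly sales figures per region.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("North", "2024-01", 100.0), ("North", "2024-02", 150.0),
    ("South", "2024-01", 80.0),  ("South", "2024-02", 60.0),
])

# Aggregate per region and rank by total to surface the trend.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('North', 250.0), ('South', 140.0)]
```

The same GROUP BY / ORDER BY pattern carries over directly to the warehouse databases and BI tools (Tableau, Power BI) the posting mentions.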
Posted 2 months ago
0.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Inviting applications for the role of Lead Consultant - Data Engineer, AWS! Responsibilities Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. 
Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using big data technologies such as Apache Hadoop and Apache Spark with appropriate cloud services. Build data pipelines by building ETL (Extract-Transform-Load) processes. Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements and user stories in business meetings, assess the impact of requirements on different platforms and applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems. Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability and improved security. Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged. Coordinate with release management and other supporting teams to deploy changes to the production environment. Qualifications we seek in you! Minimum Qualifications Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift. Experience with Databricks would be an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. 
Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams. Preferred Qualifications/Skills Master's degree in Computer Science, Electronics, or Electrical Engineering. AWS data engineering and cloud certifications; Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes. 
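The extract-transform-load pattern that runs through the responsibilities above can be sketched without any cloud dependencies. In this minimal sketch the CSV payload, field names, and the in-memory "warehouse" list are invented stand-ins for the real sources and AWS targets (Glue jobs, Redshift tables) the posting names:

```python
import csv
import io

# Hypothetical raw feed; note the missing amount for user 2.
RAW = "user_id,amount\n1,10.5\n2,\n3,4.0\n"

def extract(text):
    """Parse the raw CSV payload into dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop records with missing amounts and cast fields to proper types."""
    return [{"user_id": int(r["user_id"]), "amount": float(r["amount"])}
            for r in rows if r["amount"]]

def load(rows, warehouse):
    """Append the clean records to the target store; return the count loaded."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
print(loaded, warehouse)
```

In an AWS pipeline the same three stages would typically map onto S3 reads, a Glue or Spark transform, and a Redshift load, with the transform step carrying the data-quality rules.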
Posted 2 months ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Inviting applications for the role of Consultant, Senior Data Scientist! In this role, you will bring a strong background in Gen AI implementations, data engineering, developing ETL processes, and utilizing machine learning tools to extract insights and drive business decisions. The Data Scientist will be responsible for analysing large datasets, developing predictive models, and communicating findings to various stakeholders. Responsibilities Develop and maintain machine learning models to identify patterns and trends in large datasets. Utilize Gen AI and various LLMs to design and develop production-ready use cases. Collaborate with cross-functional teams to identify business problems and develop data-driven solutions. Communicate complex data findings and insights to non-technical stakeholders in a clear and concise manner. 
Continuously monitor and improve the performance of existing models and processes. Stay up to date with industry trends and advancements in data science and machine learning. Design and implement data models and ETL processes to extract, transform, and load data from various sources. Good hands-on experience with AWS Bedrock models, SageMaker, Lambda, etc. Data Exploration & Preparation - Conduct exploratory data analysis and clean large datasets for modeling. Business Strategy & Decision Making - Translate data insights into actionable business strategies. Mentor Junior Data Scientists - Provide guidance and expertise to junior team members. Collaborate with Cross-Functional Teams - Work with engineers, product managers, and stakeholders to align data solutions with business goals. Qualifications we seek in you! Minimum Qualifications Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field. Relevant years of experience in a data science or analytics role. Strong proficiency in SQL and experience with data warehousing and ETL processes. Experience with at least one of Python or R is a must. Familiarity with machine learning tools and libraries such as Pandas, scikit-learn, and AI libraries. Excellent knowledge of Gen AI, RAG, and LLM models, and a strong understanding of prompt engineering. Proficiency in Azure OpenAI and AWS SageMaker implementation. Good understanding of statistical techniques and advanced machine learning. Proficiency in SQL and database management. Familiarity with cloud-based data platforms such as AWS, Azure, or Google Cloud. Experience with Azure ML Studio is desirable. Knowledge of different machine learning algorithms and their applications. Familiarity with data preprocessing and feature engineering techniques. Preferred Qualifications/Skills Experience with model evaluation and performance metrics. 
Understanding of deep learning and neural networks is a plus. Certification in AWS Machine Learning or as an AWS infrastructure engineer is a plus.
Posted 2 months ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Inviting applications for the role of Lead Consultant, Senior Data Scientist! In this role, you will bring a strong background in Gen AI implementations, data engineering, developing ETL processes, and utilizing machine learning tools to extract insights and drive business decisions. The Data Scientist will be responsible for analysing large datasets, developing predictive models, and communicating findings to various stakeholders. Responsibilities Develop and maintain machine learning models to identify patterns and trends in large datasets. Utilize Gen AI and various LLMs to design and develop production-ready use cases. Collaborate with cross-functional teams to identify business problems and develop data-driven solutions. Communicate complex data findings and insights to non-technical stakeholders in a clear and concise manner. 
Continuously monitor and improve the performance of existing models and processes. Stay up to date with industry trends and advancements in data science and machine learning. Design and implement data models and ETL processes to extract, transform, and load data from various sources. Good hands-on experience with AWS Bedrock models, SageMaker, Lambda, etc. Data Exploration & Preparation - Conduct exploratory data analysis and clean large datasets for modeling. Business Strategy & Decision Making - Translate data insights into actionable business strategies. Mentor Junior Data Scientists - Provide guidance and expertise to junior team members. Collaborate with Cross-Functional Teams - Work with engineers, product managers, and stakeholders to align data solutions with business goals. Qualifications we seek in you! Minimum Qualifications Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field. Relevant years of experience in a data science or analytics role. Strong proficiency in SQL and experience with data warehousing and ETL processes. Experience with at least one of Python or R is a must. Familiarity with machine learning tools and libraries such as Pandas, scikit-learn, and AI libraries. Excellent knowledge of Gen AI, RAG, and LLM models, and a strong understanding of prompt engineering. Proficiency in Azure OpenAI and AWS SageMaker implementation. Good understanding of statistical techniques and advanced machine learning. Proficiency in SQL and database management. Familiarity with cloud-based data platforms such as AWS, Azure, or Google Cloud. Experience with Azure ML Studio is desirable. Knowledge of different machine learning algorithms and their applications. Familiarity with data preprocessing and feature engineering techniques. Preferred Qualifications/Skills Experience with model evaluation and performance metrics. 
Understanding of deep learning and neural networks is a plus. Certification in AWS Machine Learning or as an AWS infrastructure engineer is a plus.
Posted 2 months ago
6.0 - 9.0 years
5 - 10 Lacs
Bengaluru, Karnataka, India
On-site
We are seeking a skilled Data Engineer to join our team in India. The ideal candidate will be responsible for designing, building, and maintaining data pipelines that support our data-driven decision-making processes. You will work closely with cross-functional teams to ensure that data is accessible, reliable, and actionable. Responsibilities Designing and implementing data pipelines to collect, store, and process large datasets. Developing and maintaining scalable ETL processes for data ingestion and transformation. Collaborating with data scientists and analysts to understand data requirements and provide access to relevant data. Ensuring data quality and integrity through automated testing and monitoring. Optimizing database performance and query efficiency. Documenting data architecture and processes for team knowledge sharing. Skills and Qualifications Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Strong knowledge of data warehousing concepts and experience with data warehousing solutions (e.g., Snowflake, Amazon Redshift). Experience with big data technologies (e.g., Hadoop, Spark) and distributed computing frameworks. Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Programming skills in Python, Java, or Scala for data processing and automation. Understanding of data modeling and schema design principles. Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
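One responsibility above is ensuring data quality and integrity through automated checks. A minimal sketch of such a check (the rules, field names, and records are hypothetical, invented for illustration) flags duplicate keys and invalid values before data moves downstream:

```python
def check_quality(records):
    """Return (index, problem) pairs for records failing basic quality rules."""
    problems = []
    seen_ids = set()
    for i, rec in enumerate(records):
        # Rule 1: primary-key field must be unique across the batch.
        if rec.get("id") in seen_ids:
            problems.append((i, "duplicate id"))
        seen_ids.add(rec.get("id"))
        # Rule 2: amount must be present and non-negative.
        if rec.get("amount") is None or rec["amount"] < 0:
            problems.append((i, "bad amount"))
    return problems

records = [
    {"id": 1, "amount": 5.0},
    {"id": 1, "amount": 3.0},   # duplicate id
    {"id": 2, "amount": -1.0},  # negative amount
]
print(check_quality(records))  # [(1, 'duplicate id'), (2, 'bad amount')]
```

In a real pipeline these rules would run as a gate step after ingestion, failing the batch or quarantining bad rows before the load stage.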
Posted 2 months ago
6.0 - 9.0 years
5 - 10 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a skilled Data Engineer to join our team in India. The ideal candidate will be responsible for designing, building, and maintaining data pipelines that support our data-driven decision-making processes. You will work closely with cross-functional teams to ensure that data is accessible, reliable, and actionable.

Responsibilities
- Designing and implementing data pipelines to collect, store, and process large datasets.
- Developing and maintaining scalable ETL processes for data ingestion and transformation.
- Collaborating with data scientists and analysts to understand data requirements and provide access to relevant data.
- Ensuring data quality and integrity through automated testing and monitoring.
- Optimizing database performance and query efficiency.
- Documenting data architecture and processes for team knowledge sharing.

Skills and Qualifications
- Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
- Strong knowledge of data warehousing concepts and experience with data warehousing solutions (e.g., Snowflake, Amazon Redshift).
- Experience with big data technologies (e.g., Hadoop, Spark) and distributed computing frameworks.
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Programming skills in Python, Java, or Scala for data processing and automation.
- Understanding of data modeling and schema design principles.
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
Posted 2 months ago