10.0 - 14.0 years
0 Lacs
uttar pradesh
On-site
As the Senior Pega Manager, you will be responsible for leading and managing Pega development teams to design, develop, and implement Pega-based applications. You will provide technical leadership and guidance on Pega-related projects, ensuring alignment with business requirements and industry best practices.

Key Responsibilities:
- Oversee the development of Pega applications, including case management, decision management, and robotics automation.
- Collaborate with stakeholders to define project scope, timelines, and resource allocation.
- Ensure high-quality delivery of Pega projects that meet business requirements and timelines.
- Develop and implement Pega-related strategies, standards, and best practices.
- Coach and mentor team members to enhance their Pega skills and knowledge.
- Manage project budgets, resource allocation, and vendor relationships to ensure successful project outcomes.
- Ensure compliance with Pega governance and regulatory requirements.

Technical Skills:
- Proficiency in Pega PRPC (7.x, 8.x) and Pega Cloud.
- Expertise in Pega Case Management, Decision Management, and Robotics Automation.
- Strong understanding of Pega architecture, rules, and data modeling.
- Experience with Pega implementation methodologies such as the Pega Agile Delivery Methodology.
- Knowledge of integration technologies such as APIs and web services.

Soft Skills:
- Strong leadership, management, and communication skills.
- Ability to collaborate effectively with cross-functional teams and stakeholders.
- Excellent problem-solving, analytical, and decision-making skills.
- Strong project management abilities, including budgeting and resource allocation.

Experience and Qualifications:
- Minimum of 10 years of experience in Pega development, with at least 5 years in lead or managerial roles.
- Proven experience leading Pega projects and teams, with a strong understanding of Pega best practices and industry trends.
- Bachelor's or Master's degree in Computer Science, Information Technology, or related fields.
- Certifications such as Pega Certified Senior System Architect (PCSSA) or higher are desirable.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Lead Software Engineer at JPMorgan Chase within the Consumer & Community Banking team, you play a crucial role in an agile team dedicated to enhancing, building, and delivering trusted, market-leading technology products with a focus on security, stability, and scalability. Your responsibilities include executing creative software solutions and designing, developing, and troubleshooting technical issues with an innovative mindset. You are expected to develop high-quality, secure production code, review and debug code written by team members, and identify opportunities to automate remediation processes for enhanced operational stability.

In this role, you will lead evaluation sessions with external vendors, startups, and internal teams to assess architectural designs and technical applicability within existing systems. You will also drive awareness and adoption of new technologies within Software Engineering communities, contributing to a diverse, inclusive, and respectful team culture.

To excel in this position, you should have formal training or certification in software engineering concepts along with at least 5 years of practical experience. Strong proficiency in database systems, including SQL and NoSQL, and programming languages such as Python, Java, or Scala is essential. Experience in data architecture, data modeling, data warehousing, and data lakes, as well as implementing complex ETL transformations on big data platforms, will be beneficial (a brief sketch follows this posting). Proficiency in the Software Development Life Cycle and agile practices such as CI/CD, Application Resiliency, and Security is required.

The ideal candidate will have hands-on experience with software applications and technical processes within a specific discipline (e.g., cloud, artificial intelligence, machine learning) and a background in the financial services industry. Practical experience with cloud-native technologies is highly desirable, and additional Java and data programming experience is considered a plus.
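To give a flavor of the kind of ETL transformation on a big data platform this role calls for, here is a minimal, hedged PySpark sketch; the file paths and column names are hypothetical stand-ins, not anything from the posting.

```python
# A minimal PySpark ETL sketch -- illustrative only; the paths and
# column names (transactions.csv, amount, account_id) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw transactions from a CSV landing zone
raw = spark.read.option("header", True).csv("s3://bucket/landing/transactions.csv")

# Transform: cast types, drop malformed rows, aggregate per account
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)
per_account = clean.groupBy("account_id").agg(
    F.sum("amount").alias("total_amount"),
    F.count("*").alias("txn_count"),
)

# Load: write the aggregate to a curated Parquet zone
per_account.write.mode("overwrite").parquet("s3://bucket/curated/account_totals")
```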
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
4CRisk is an AI start-up uniquely positioned to identify and solve the annual $300 billion risk and compliance problem for banks, non-bank financial institutions, and FinTech companies. The company's mission is to help customers protect brand value and strengthen revenues by reducing risk and the cost of compliance. At 4CRisk, technology, data, UI, and products have all been conceived with a customer-centric approach, in the belief that culture trumps aptitude. Our engineering center (4CRisk.ai Software Private Ltd.) in Bangalore, India is seeking bright and passionate candidates who share our vision and wish to be part of a team of talented professionals.

We are looking for a Data Quality Analyst to utilize regulatory data to drive product decisions. Collaborating with cross-functional teams of product managers, designers, and engineers, you will apply your expertise to deliver customer insights and help shape the products we offer. Leveraging rich user data through cutting-edge technology, you will see your insights transformed into real products.

Key Responsibilities:
- Performing statistical tests on large datasets to determine data quality and integrity.
- Evaluating system performance and design and their impact on data quality.
- Collaborating with AI and Data Engineers to enhance data collection and storage processes.
- Running data queries to identify quality issues and data exceptions, and cleaning data (see the sketch after this posting).
- Gathering data from primary or secondary sources to identify and interpret trends.
- Reporting data analysis findings to management to inform business decisions and prioritize information system needs.
- Documenting processes and maintaining data records.
- Adhering to best practices in data analysis and collection.
- Staying updated on developments and trends in data quality analysis.

Required Experience/Skills:
- Data quality analysis experience is a must, including root-cause analysis and data slicing.
- Designing, building, and executing data quality plans for complex data management solutions on modern data processing frameworks.
- Understanding data lineage and preparing validation cases to verify data at each stage of the data processing journey.
- Planning, designing, and conducting validations of data-related implementations to achieve acceptable results.
- Developing dataset creation scripts for data verification during extraction, transformation, and loading phases by validating data mapping and transformation rules.
- Supporting AI and Product Management teams by contributing to the development of a data validation strategy focused on building the regression suite.
- Documenting issues and collaborating with data engineers to resolve them and ensure quality standards.
- Efficiently capturing business requirements and translating them into functional, non-functional, and semantic specifications.
- Data profiling, data modeling, and data validation testing experience is a plus.
- 1 to 3+ years of proven experience.
- Excellent presentation, communication (oral and written English), and relationship-building skills across all management levels and customer interactions.
- Ability to collaborate with team members globally and across departments.

Location: Bangalore, India.
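To make the data-quality work above concrete, here is a minimal pandas sketch of the kinds of checks described; the file and column names are hypothetical, not from the posting.

```python
# Minimal data-quality checks in pandas -- illustrative only; the file
# and column names (regulations.csv, doc_id, published_date) are hypothetical.
import pandas as pd

df = pd.read_csv("regulations.csv")

# Completeness: percentage of missing values per column
null_pct = df.isna().mean().mul(100).round(2)
print("Null % by column:\n", null_pct)

# Uniqueness: duplicate records on the business key
dupes = df[df.duplicated(subset=["doc_id"], keep=False)]
print(f"{len(dupes)} duplicate rows on doc_id")

# Validity: dates that fail to parse become NaT and are flagged
parsed = pd.to_datetime(df["published_date"], errors="coerce")
print(f"{parsed.isna().sum()} unparseable published_date values")
```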
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
The global Startup Analytics and Performance Management team is seeking a motivated Analytics Specialist - Procurement and IT to fulfill the reporting and analytics requirements of the Procurement and IT business functions. As an Analytics Specialist, your primary responsibility will be creating Power BI reports and data models that meet the analytics needs of those functions. You will design intuitive dashboards with compelling data-storytelling elements and generate reports that enable users to easily comprehend and retain information.

You will also play a crucial role in suggesting effective UI/UX designs and supporting the team in developing advanced Power BI reports essential for performance management, using DAX and Power Query where necessary to build standard reporting solutions. Your proficiency in acquiring data from various sources such as flat files and XLSX, coupled with data transformation skills in Python, will help the team develop efficient reporting solutions (see the sketch after this posting).

The ideal candidate should hold a Bachelor's or Master's degree in computer science or a related field, along with a minimum of 6 years of professional experience, including at least 5 years of expertise in Power BI dashboard creation and data visualization and a minimum of 2 years of experience in Python.

Key Knowledge/Skills:
- Proficiency in DAX, Power Query, and data modeling within Power BI.
- Familiarity with data visualization best practices and principles of user experience design.
- Working knowledge of UI/UX design to implement interactive dashboards.
- Ability to craft engaging narratives through visualization capabilities.
- Strong understanding of Procurement and IT processes.
- Intermediate skills in Python for data manipulation and analysis.

Preferred Qualifications:
- Excellent communication skills, capable of presenting complex data insights to non-technical stakeholders effectively.
- Ability to manage, influence, negotiate, and communicate with internal business partners to address organizational capacity requirements.
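As a sketch of the Python data-acquisition and transformation step described above, here is a small pandas example that tidies a flat file and an Excel extract into a table a Power BI dataset could ingest; the file and column names are made up for illustration.

```python
# Sketch of a Python pre-processing step for a reporting dataset --
# illustrative only; file and column names are hypothetical.
import pandas as pd

# Acquire: combine a flat file and an Excel extract
spend = pd.read_csv("procurement_spend.csv", parse_dates=["order_date"])
vendors = pd.read_excel("vendor_master.xlsx")  # requires openpyxl

# Transform: normalize keys, join, and derive a reporting field
spend["vendor_id"] = spend["vendor_id"].str.strip().str.upper()
merged = spend.merge(vendors, on="vendor_id", how="left")
merged["order_month"] = merged["order_date"].dt.to_period("M").astype(str)

# Output a tidy table for the BI layer
merged.to_csv("spend_by_month.csv", index=False)
```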
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Architect with over 4 years of experience, you will be responsible for designing and optimizing data pipelines that integrate various data sources to support business intelligence and advanced analytics. Your role will involve developing data models and flows to enable personalized customer experiences and to support omnichannel marketing and customer engagement. You will lead efforts to ensure data governance, data quality, and data security, including compliance with regulations such as GDPR and CCPA.

You will implement and maintain data warehousing solutions in Snowflake to handle large-scale data processing and analytics needs, and optimize workflows to streamline data transformation and modeling processes. You will leverage Azure for cloud infrastructure, data storage, and real-time data analytics, ensuring that the architecture supports scalability and performance. Collaboration with cross-functional teams, including data engineers, analysts, and business stakeholders, will be essential to ensure that data architectures meet business needs. Supporting both real-time and batch data integration will be crucial to making data accessible for actionable insights and decision-making. You will also continuously assess and integrate new data technologies and methodologies to enhance the organization's data capabilities.

To qualify for this role, you should have at least 4 years of experience in Data Architecture or Data Engineering, with expertise in Snowflake and Azure. A strong understanding of data modeling, ETL/ELT processes, and modern data architecture frameworks is required, along with experience designing scalable data architectures for personalization and customer analytics across the marketing, sales, and customer service domains. You should also have expertise with cloud data platforms (preferably Azure) and big data technologies for large-scale data processing, plus hands-on experience with Python for data engineering tasks and scripting.

Primary Skills:
- Around 3+ years of relevant hands-on experience in DBT, Snowflake, CI/CD, Python (nice to have), and SQL.
- Taking ownership of tasks.
- Eagerness to learn, good communication skills, and enthusiasm for upskilling.

If you are a motivated and experienced Data Architect looking to work on challenging projects in a dynamic environment, we encourage you to apply for this position.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Tech Lead at Carelon Global Solutions India, your primary responsibility will be to define solution architecture for applications for an OBA, in alignment with enterprise standards and policies. You will serve as a technical subject matter expert for multiple technologies, ensuring adherence to code standards and policies while supporting various application development projects.

The ideal candidate for this role should have a BE/MCA qualification with over 10 years of IT experience, including at least 5 years of in-depth knowledge of Elevance Health applications/platforms such as WGS, Facets, SPS, data platforms, Member/Provider communications (Sydney/Solution Central), and Carelon services (Carelon BH, Carelon RX, etc.). You should possess a good understanding of ETL tools, database concepts, data modeling, ETL best practices, multi-cloud environments (AWS, Azure, GCP), data security protocols, ERP/CRM tools, and integration technologies such as API management, SOA, microservices, and Kafka topics. Knowledge of EA architecture guidelines and principles will be beneficial for this role.

At Carelon Global Solutions, we believe in offering limitless opportunities to our associates, fostering an environment that promotes growth, well-being, and a sense of purpose and belonging. Our focus on learning and development, innovative culture, comprehensive rewards, competitive health insurance, and employee-centric policies make Life @ Carelon enriching and fulfilling. We are an equal opportunity employer committed to diversity and inclusion, and we provide reasonable accommodations to ensure a supportive work environment for all. If you require assistance due to a disability, please request the Reasonable Accommodation Request Form. Join us on this exciting journey at Carelon Global Solutions and be a part of our mission to simplify healthcare and improve lives and communities.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
The Senior Semantic Modeler will be responsible for designing, developing, and maintaining semantic models using platforms like CubeDev, HoneyDew, AtScale, and others. This role requires a deep understanding of semantic modeling principles and practices. You will work closely with data architects, data engineers, and business stakeholders to ensure the accurate and efficient representation of data for Generative AI and Business Intelligence purposes. Experience with graph-based semantic models is a plus.

As a Product Architect - Semantic Modelling, your key responsibilities will include:
- Designing and developing semantic data models using platforms such as CubeDev, HoneyDew, AtScale, etc.
- Creating and maintaining semantic layers that accurately represent business concepts and support complex querying and reporting.
- Collaborating with stakeholders to understand data requirements and translating them into semantic models.
- Integrating semantic models with the existing Gen AI and BI infrastructure alongside data architects and engineers.
- Ensuring the alignment of semantic models with business needs and data governance policies.
- Defining key business metrics within the semantic models for consistent and accurate reporting.
- Identifying and documenting metric definitions in collaboration with business stakeholders.
- Implementing processes for metric validation and verification to ensure accuracy and reliability.
- Monitoring and maintaining the performance of metrics within the semantic models and addressing any issues promptly.
- Developing efficient queries and scripts for data retrieval and analysis.
- Conducting regular reviews and updates of semantic models to ensure their effectiveness.
- Providing guidance and expertise on semantic technologies and best practices to the development team.
- Performing data quality assessments and implementing improvements for data integrity and consistency.
- Staying up to date with the latest trends in semantic technologies and incorporating relevant innovations into the modeling process.
- Secondary responsibilities may include designing and developing graph-based semantic models using RDF, OWL, and other semantic web standards, and creating and maintaining ontologies that accurately represent domain knowledge and business concepts (see the sketch after this posting).

Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- Minimum of 6+ years of experience in semantic modeling, data modeling, or related roles.
- Proficiency in semantic modeling platforms such as CubeDev, HoneyDew, AtScale, etc.
- Strong understanding of data integration and ETL processes.
- Familiarity with data governance and data quality principles.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Experience with graph-based semantic modeling tools such as Protégé, Jena, or similar is a plus.

Functional skills:
- Experience in the life sciences commercial analytics industry is preferred, with familiarity with industry-specific data standards.
- Knowledge of Gen AI concepts and frameworks would be a plus.
- Certification in BI semantic modeling or related technologies.

Trinity is a life science consulting firm, founded in 1996, committed to providing evidence-based solutions for life science corporations globally.
With over 25 years of experience, Trinity is dedicated to solving clients' most challenging problems through exceptional service, powerful tools, and data-driven insights. Trinity has 12 offices globally, serving 270+ life sciences customers with 1200+ employees. The India office was established in 2017 and currently has around 350+ employees, with plans for exponential growth. Qualifications: B.E. graduates are preferred.
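As a flavor of the graph-based semantic modeling mentioned under the secondary responsibilities, here is a minimal rdflib sketch in Python; the ontology namespace, class, and instance names are hypothetical, and this is only one way to express such a model.

```python
# Minimal RDF/OWL ontology sketch with rdflib -- illustrative only;
# the namespace and class names are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF, RDFS, OWL

EX = Namespace("http://example.org/pharma#")
g = Graph()
g.bind("ex", EX)

# Declare two domain classes and a property linking them
g.add((EX.Product, RDF.type, OWL.Class))
g.add((EX.TherapeuticArea, RDF.type, OWL.Class))
g.add((EX.treats, RDF.type, OWL.ObjectProperty))
g.add((EX.treats, RDFS.domain, EX.Product))
g.add((EX.treats, RDFS.range, EX.TherapeuticArea))

# An instance assertion
g.add((EX.DrugX, RDF.type, EX.Product))
g.add((EX.DrugX, EX.treats, EX.Oncology))
g.add((EX.DrugX, RDFS.label, Literal("Drug X")))

print(g.serialize(format="turtle"))
```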
Posted 1 week ago
4.0 - 12.0 years
0 Lacs
karnataka
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role
- Implement and optimize SAP CAR functionalities for retail clients.
- Develop analytical solutions for demand forecasting and inventory management.
- Integrate CAR with POS and other SAP modules.
- Provide ongoing support and enhancements.

Your Profile
- 4-12 years of SAP CAR and retail industry experience.
- Knowledge of POS integration, analytics, and reporting.
- Expertise in SAP HANA and data modeling.
- Excellent communication and teamwork skills.

What You Will Love About Working at Capgemini
- Retail innovation projects.
- Collaborative work environment.
- Growth and learning opportunities.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong 55-year-plus heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
Posted 1 week ago
5.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Please carefully review the position requirements before submitting a potential candidate for consideration.

Role Summary: The Qlik Sense developer will build dashboards and reports consuming data from Snowflake, delivering clear visualizations that help end users understand the data and increase product adoption.

Roles and Responsibilities:
- Requirement gathering.
- Managing all Qlik-related activities.
- Managing and following up on all Qlik-related access requests.
- Managing and operating ServiceNow tickets (incidents, requests).
- Supporting Qlik and NPrinting development.
- Managing all Qlik production movements.
- Monitoring the daily Qlik jobs.
- NPrinting reports.
- Supporting month-start activities.
- Training Qlik end users.
- Supporting governance activities every month.
- Preparing Qlik-related support documents.

Role-Specific Competencies:
- Good data modeling skills.
- Dashboard development.
- Good visualization skills.
- Admin-related activities.
- Knowledge of other BI and visualization tools (e.g., Tableau, Power BI) - optional.
- Strong analytical and problem-solving skills.
- Data analysis skills.

Experience: 6 to 8 years

For additional details regarding submission eligibility and payment terms, please refer to your contract. Only submissions from agencies with current service contracts in place will be considered.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
You will be joining our team as an experienced SAP Native HANA Developer for one of our clients. Your primary responsibility will be developing solutions using SAP Native HANA, and you should have a minimum of 4 to 6+ years of experience in this field. The position is onsite in Noida, and we prefer candidates who can join immediately or within 15 days.

Your key skills should include hands-on experience in SAP Native HANA development; a strong command of SQLScript, calculation views, and procedures (see the sketch after this posting); and expertise in data modeling and performance tuning. Familiarity with SAP integration and reporting tools is highly desired. We are looking for someone who can work both independently and collaboratively with business teams to deliver high-quality solutions.

If you believe you are a suitable candidate for this position, please share your resume at manikanta.p@creenosolutions.com. If you know someone who fits this role well, feel free to refer them to us.
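To give a flavor of consuming a HANA calculation view from Python, here is a hedged sketch using SAP's hdbcli client; the host, credentials, package, and view names are all hypothetical placeholders, and the exact schema path depends on how the view is deployed.

```python
# Querying a HANA calculation view via SAP's hdbcli client -- a sketch
# under assumptions; connection details and the view name are hypothetical.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.com",  # hypothetical host
    port=30015,
    user="REPORT_USER",
    password="***",
)
try:
    cur = conn.cursor()
    # Classic calculation views are commonly exposed under _SYS_BIC
    cur.execute(
        'SELECT "REGION", SUM("REVENUE") AS "REVENUE" '
        'FROM "_SYS_BIC"."sales.models/CV_SALES" '
        'GROUP BY "REGION"'
    )
    for region, revenue in cur.fetchall():
        print(region, revenue)
finally:
    conn.close()
```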
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
indore, madhya pradesh
On-site
As a SAP HANA Developer at our Indore onsite location, you will be responsible for the implementation, data modeling, performance optimization, support, and maintenance of SAP HANA systems. The role requires a minimum of 2 to 5 years of experience and a Bachelor's degree in Computer Science or a related field.

Your primary responsibilities will include participating in the design, installation, and configuration of SAP HANA systems, and developing and executing data provisioning strategies using tools such as SLT, Data Services, and Smart Data Integration. You will design and develop data models using SAP HANA Studio, create calculation views, attribute views, and analytic views, and implement data warehousing solutions while optimizing existing data models for performance.

Performance optimization will be a key aspect of your role: you will tune SAP HANA systems and identify and resolve performance bottlenecks in SQL queries and data models. On the support and maintenance front, you will provide ongoing support, troubleshoot technical issues, and perform regular system monitoring and health checks.

Collaboration and communication skills are essential, as you will work closely with business stakeholders to understand requirements and deliver solutions that align with business needs. You will also collaborate with other IT teams to ensure seamless integration with other SAP modules and third-party systems, and prepare and deliver technical documentation and training for end users to ensure effective utilization of SAP HANA systems across the organization.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
punjab
On-site
The Manager - Data Governance and Data Quality position at Bunge, located in Mohali, Punjab, India, requires a candidate with 8-10 years of experience in Data Governance and Data Quality. The individual will drive the successful implementation and adoption of the Collibra Data Governance platform, with a specific focus on Collibra Data Quality. A solid understanding of the Collibra meta model is essential, including assets, domains, communities, and metadata ingestion using templates.

Responsibilities include establishing data quality, data lineage, and metadata management processes within Collibra, along with exposure to GCP, data privacy, data domains, and APIs. The Manager will monitor and report on data governance metrics and KPIs, identify areas for improvement, and implement corrective actions. Effective communication and collaboration skills are crucial for working with cross-functional teams.

The ideal candidate should hold a Bachelor of Engineering, Master of Computer Science, or Master of Science degree from a premier institute. Proficiency in the Collibra stack of tools (DIC, DQ), data warehousing, data modeling, and ETL is required. The individual should be able to break down problems into manageable pieces, plan tasks effectively, and deliver high-quality results on time, while taking ownership of assigned tasks, driving results through high standards, and adapting to change.

Bunge, a global leader in sourcing, processing, and supplying oilseed and grain products, offers sustainable products and opportunities for farmers and consumers worldwide. With headquarters in St. Louis, Missouri, and a workforce of over 25,000 employees, Bunge operates numerous port terminals, processing plants, grain facilities, and food production units globally.
Posted 1 week ago
3.0 - 6.0 years
0 - 0 Lacs
Chennai
Work from Office
Who we are looking for:
We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The Data Infrastructure engineering team is responsible for the tools and backend infrastructure that support our data platform, optimizing its performance, scalability, and reliability. This role requires strong focus and experience in multi-cloud technologies, message bus systems, automated deployments using containerized applications, design, development, database management and performance, SOX compliance requirements, and implementation of infrastructure through Terraform automation, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in the development of solutions to ACV's most complex data and software problems. You will operate in a high-performing team, act as a technical liaison, balance high-quality deliverables with customer focus, communicate effectively, have the desire and ability to mentor and guide engineers, and have a record of delivering results in a fast-paced environment.

What you will be doing:
- Collaborate with cross-functional teams, including Data Scientists, Software Engineers, Data Engineers, and Data Analysts, to understand data requirements and translate them into technical specifications.
- Influence company-wide engineering standards for databases, tooling, languages, and build systems.
- Design, implement, and maintain scalable and high-performance data infrastructure solutions, with a primary focus on data.
- Design, implement, and maintain tools and best practices for (but not limited to) access control, data versioning, database management, and migration strategies.
- Contribute to, influence, and set standards for all technical aspects of a product or service, including but not limited to coding, testing, debugging, performance, languages, database selection, management, and deployment.
- Identify and troubleshoot database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions.
- Write clean, maintainable, well-commented code and automation to support our data infrastructure layer.
- Perform code reviews, develop high-quality documentation, and build robust test suites for your products.
- Provide technical support for databases, including troubleshooting, performance tuning, and resolving complex issues.
- Collaborate with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance of our products.
- Collaborate with development and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency.
- Participate in SOX audits, including the creation of standards and reproducible audit evidence through automation.
- Create and maintain documentation for database and system configurations, procedures, and troubleshooting guides.
- Maintain and extend (as required) existing database operations solutions for backups, index defragmentation, data retention, etc. (see the sketch after this posting).
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively.
- Be accountable for the overall performance of products and/or services within a defined area of focus.
- Be part of the on-call rotation.
- Handle multiple competing priorities in an agile, fast-paced environment.
- Perform additional duties as assigned.

What you need:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Ability to read, write, speak, and understand English.
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment.
- 1+ years of experience architecting, developing, and delivering software products with an emphasis on the data infrastructure layer.
- 1+ years of work with continuous integration and build tools.
- 1+ years of experience programming in Python.
- 1+ years of experience with cloud platforms, preferably GCP/AWS.
- Knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools.
- Knowledge of version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebasing.
- Hands-on skills and the ability to drill deep into complex system design and implementation.
- Experience with: DevOps practices and tools for database automation and infrastructure provisioning; programming in Python and SQL; GitHub and Jenkins; infrastructure-as-code tooling, such as Terraform (preferred); big data technologies and distributed databases.

Nice to have qualifications:
- Experience with NoSQL data stores.
- Airflow, Docker, containers, Kubernetes, DataDog, Fivetran.
- Database monitoring and diagnostic tools, preferably DataDog.
- Database management/administration with PostgreSQL, MySQL, Dynamo, Mongo.
- GCP/BigQuery, Confluent Kafka.
- Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP.
- Service-Oriented Architecture/microservices and event sourcing on a platform like Kafka (preferred).
- Familiarity with DevOps practices and tools for automation and infrastructure provisioning.
- Hands-on experience with SOX compliance requirements.
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks.
- Knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques.
#LI-AM1
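As a small illustration of the database-operations work named above, here is a hedged Python/psycopg2 sketch that flags PostgreSQL tables with heavy dead-tuple counts as vacuum or reindex candidates; the connection details are hypothetical placeholders.

```python
# Sketch: flag tables with heavy dead-tuple counts as vacuum/reindex
# candidates -- illustrative only; connection details are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="db.example.com", dbname="appdb",  # hypothetical placeholders
    user="ops_ro", password="***",
)
try:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT relname, n_live_tup, n_dead_tup
            FROM pg_stat_user_tables
            ORDER BY n_dead_tup DESC
            LIMIT 10
            """
        )
        for table, live, dead in cur.fetchall():
            print(f"{table}: {dead} dead / {live} live tuples")
finally:
    conn.close()
```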
Posted 1 week ago
10.0 - 15.0 years
0 - 0 Lacs
Chennai
Work from Office
Who we are looking for:
We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The Data Infrastructure engineering team is responsible for the tools and backend infrastructure that support our data platform, optimizing its performance, scalability, and reliability. This role requires strong focus and experience in multi-cloud technologies, message bus systems, automated deployments using containerized applications, design, development, database management and performance, SOX compliance requirements, and implementation of infrastructure through Terraform automation, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in the development of solutions to ACV's most complex data and software problems. You will operate in a high-performing team, act as a technical liaison, balance high-quality deliverables with customer focus, communicate effectively, have the desire and ability to mentor and guide engineers, and have a record of delivering results in a fast-paced environment.

What you will be doing:
- Collaborate with cross-functional teams, including Data Scientists, Software Engineers, Data Engineers, and Data Analysts, to understand data requirements and translate them into technical specifications.
- Influence company-wide engineering standards for databases, tooling, languages, and build systems.
- Design, implement, and maintain scalable and high-performance data infrastructure solutions, with a primary focus on data.
- Design, implement, and maintain tools and best practices for (but not limited to) access control, data versioning, database management, and migration strategies.
- Contribute to, influence, and set standards for all technical aspects of a product or service, including but not limited to coding, testing, debugging, performance, languages, database selection, management, and deployment.
- Identify and troubleshoot database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions.
- Write clean, maintainable, well-commented code and automation to support our data infrastructure layer.
- Perform code reviews, develop high-quality documentation, and build robust test suites for your products.
- Provide technical support for databases, including troubleshooting, performance tuning, and resolving complex issues.
- Collaborate with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance of our products.
- Collaborate with development and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency.
- Participate in SOX audits, including the creation of standards and reproducible audit evidence through automation.
- Create and maintain documentation for database and system configurations, procedures, and troubleshooting guides.
- Maintain and extend (as required) existing database operations solutions for backups, index defragmentation, data retention, etc.
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively.
- Be accountable for the overall performance of products and/or services within a defined area of focus.
- Be part of the on-call rotation.
- Handle multiple competing priorities in an agile, fast-paced environment.
- Perform additional duties as assigned.

What you need:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Ability to read, write, speak, and understand English.
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment.
- 1+ years of experience architecting, developing, and delivering software products with an emphasis on the data infrastructure layer.
- 1+ years of work with continuous integration and build tools.
- 1+ years of experience programming in Python.
- 1+ years of experience with cloud platforms, preferably GCP/AWS.
- Knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools.
- Knowledge of version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebasing.
- Hands-on skills and the ability to drill deep into complex system design and implementation.
- Experience with: DevOps practices and tools for database automation and infrastructure provisioning; programming in Python and SQL; GitHub and Jenkins; infrastructure-as-code tooling, such as Terraform (preferred); big data technologies and distributed databases.

Nice to have qualifications:
- Experience with NoSQL data stores.
- Airflow, Docker, containers, Kubernetes, DataDog, Fivetran.
- Database monitoring and diagnostic tools, preferably DataDog.
- Database management/administration with PostgreSQL, MySQL, Dynamo, Mongo.
- GCP/BigQuery, Confluent Kafka.
- Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP.
- Service-Oriented Architecture/microservices and event sourcing on a platform like Kafka (preferred).
- Familiarity with DevOps practices and tools for automation and infrastructure provisioning.
- Hands-on experience with SOX compliance requirements.
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks.
- Knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques.
#LI-AM1
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Senior Power BI Developer at our company in Ahmedabad, you will play a crucial role in designing, developing, and maintaining interactive and user-friendly Power BI dashboards and reports. Your primary responsibility will be to translate business requirements into functional and technical specifications and to perform data modeling, DAX calculations, and Power Query transformations. You will integrate data from multiple sources, including SQL Server, Excel, SharePoint, and APIs; optimize Power BI datasets, reports, and dashboards for performance and usability; and ensure security and governance best practices in Power BI workspaces and datasets. Collaboration with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance is key, along with providing ongoing support and troubleshooting for existing Power BI solutions. Staying updated with Power BI updates, best practices, and industry trends will be an ongoing focus.

The ideal candidate holds a Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field, with at least 4+ years of professional experience in data analytics or business intelligence, including a minimum of 3+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service). Strong expertise in DAX, Power Query (M language), and data modeling (star/snowflake schema) is required, along with proficiency in writing complex SQL queries and optimizing them for performance, and experience working with large and complex datasets. Familiarity with BigQuery, MySQL, and Looker Studio would be advantageous, and experience in the e-commerce industry is an added advantage. A solid understanding of data warehousing concepts and ETL processes is necessary, and experience with Power Apps and Power Automate would be a plus.

Preferred qualifications include Microsoft Power BI Certification (PL-300 or equivalent), experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse), knowledge of other BI tools (Tableau, Qlik), familiarity with scripting languages (Python, R) for data analysis, and experience integrating Power BI into web portals using Power BI Embedded.

If you have skills in Power BI, DAX, Looker, MySQL, data modeling, and data visualization, and are looking for a challenging role with opportunities for growth and learning, we encourage you to apply for this exciting position.
Posted 1 week ago
3.0 - 8.0 years
12 - 14 Lacs
Hyderabad
Hybrid
Role & responsibilities
- Coordinate test development and delivery processes for assigned programs.
- Manage schedules, tools, and workflows to ensure timely and accurate assessment production.
- Collaborate with internal teams and external stakeholders to support smooth program execution.
- Assist senior leads in tracking progress and integrating technology into assessment systems.
- Develop and monitor plans and processes for a more efficient operational workflow.
- Assist in the preparation of test-related deliverables and innovative products and services, including special reports, proposals, and surveys.
- Assist with the preparation and monitoring of schedules for producing tests and test-related deliverables.
- Communicate with the organisation's staff, vendors, and clients regarding scheduling, key due dates, and deliverables, and update schedules as significant changes occur.
- May serve as a program resource to clients and candidates, providing advice, interpreting program guidelines, and attending meetings as requested.
- Hold or participate in regular internal meetings with project staff to communicate information and monitor schedules.
- Pass information to the appropriate staff within appropriate timeframes to resolve issues affecting development.
- Assist in the preparation of project expenses and the revision of monthly forecasts.
- Adhere to ethical standards and comply with the laws and regulations applicable to your job function.

Preferred candidate profile
- A track record of adding value to official or unofficial teams by actively participating in them and seeking to understand the various interests of team members.
- Customer-focused, fostering respectful relationships with internal and external colleagues.
- Skills to plan, organize, and manage tasks and resources to accomplish a well-defined objective within constraints of time, resources, and cost.
- A strong learning orientation and willingness to develop new skills and competencies that improve personal and business performance.
- Ability to problem-solve, with the flexibility to adjust project plans and schedules and adapt existing processes and procedures to meet deliverables on time and with the expected quality, without negatively impacting colleagues, processes, or other deliverables.
- A high level of productivity and accountability for assigned work.
- Understanding of, and ability to build, Power BI report visualizations.
- Understanding of, and ability to work in, DAX, plus the ability to optimize Power BI performance.
- Ability to work with models in Power BI, with knowledge of SQL and the ability to pull that information into Power BI.
- Ability to complete data transformation.
- Bachelor's degree in the field of specialty is required.
- At least 3 years of experience in process or project management, administration, or operational or technical activities.
- Strong verbal and written communication skills, well-developed organizational skills, strong technical skills, and adeptness at learning new technology.
- Effectively communicates across technical and functional teams to translate data insights into actionable business solutions.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for designing, developing, and optimizing data models within the Celonis Execution Management System (EMS). Your duties will include extracting, transforming, and loading (ETL) data from flat files and UDP into Celonis. It is essential to work closely with business stakeholders and data analysts to understand data requirements and ensure an accurate representation of business processes. You will also develop and optimize PQL (Process Query Language) queries for process mining.

Collaboration with group data engineers, architects, and analysts is crucial to ensure high-quality data pipelines and scalable solutions. Data validation, cleansing, and transformation will also be part of your responsibilities to enhance data quality (a brief sketch follows this posting), as will monitoring and troubleshooting data integration pipelines to ensure performance and reliability. You will also provide guidance and best practices for data modeling in Celonis.

To qualify for this role, you should have a minimum of 5 years of experience in data engineering, data modeling, or related roles. Proficiency in SQL, ETL processes, and database management (e.g., PostgreSQL, Snowflake, BigQuery, or similar) is required, along with experience working with large-scale datasets and optimizing data models for performance. Your data management experience must cover the data lifecycle and critical functions such as data profiling, data modeling, data engineering, and data consumption products and services.

Strong problem-solving skills are necessary, along with the ability to work in an agile, fast-paced environment. You should have excellent communication skills and demonstrated hands-on experience communicating technical topics to non-technical audiences, and you should be able to collaborate effectively with cross-functional teams and manage the timely completion of assigned activities in a highly virtual team environment.
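To make the data-validation responsibility concrete, here is a minimal pandas sketch of checks one might run on a process-mining event log before loading it into Celonis; the file and column names are hypothetical, and this is a generic illustration rather than Celonis's own tooling.

```python
# Event-log sanity checks before loading into a process-mining tool --
# illustrative only; the file and column names are hypothetical.
import pandas as pd

log = pd.read_csv("event_log.csv", parse_dates=["timestamp"])

# Every event needs a case ID and an activity name
missing = log[["case_id", "activity"]].isna().any(axis=1).sum()
print(f"{missing} events missing case_id or activity")

# Events within each case should be recorded in time order
unordered = log.groupby("case_id")["timestamp"].apply(
    lambda s: not s.is_monotonic_increasing
)
print(f"{int(unordered.sum())} cases whose events are out of order")
```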
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
nagpur, maharashtra
On-site
As a Power BI Developer, you will be responsible for understanding business requirements in a Business Intelligence (BI) context and designing data models that transform raw data into meaningful insights. Your role will involve creating dashboards and interactive visual reports using Power BI, identifying key performance indicators (KPIs) with clear objectives, and consistently monitoring them. You will analyze data and present it through reports that aid decision-making, convert business requirements into technical specifications, decide timelines for delivery, create relationships between data, and develop tabular and multidimensional data models. Your responsibilities also include chart creation and data documentation explaining algorithms, parameters, models, and relations.

To be successful in this role, you should hold a Bachelor's degree in Computer Science, Business Administration, or a related field, along with a minimum of 6 to 8 years of experience in visual reporting development, including hands-on development of analytics dashboards and work with complex data sets. You should also have a minimum of 6 years of Power BI development experience, SQL Server expertise, excellent Microsoft Office skills (including advanced Excel), strong analytical, quantitative, problem-solving, and organizational skills, attention to detail, and the ability to coordinate multiple tasks, set priorities, and meet deadlines.

As a Power BI Lead, you will have similar responsibilities to a Power BI Developer, with additional tasks such as implementing data cleansing and data quality processes to ensure the accuracy and reliability of data. You should also be skilled in Analysis Services, building tabular and multidimensional models (OLAP, cubes) on top of DW/DM/DB. The required skills match those of the Power BI Developer: a Bachelor's degree in Computer Science, Business Administration, or a related field; a minimum of 6 years of experience in visual reporting development, including hands-on development of analytics dashboards and work with complex data sets; a minimum of 6 years of Power BI development experience and SQL Server expertise; excellent Microsoft Office skills including advanced Excel; and strong analytical, quantitative, problem-solving, and organizational skills.

As a Power BI Architect, you will collaborate with business stakeholders to understand their reporting and analytics requirements and design end-to-end Power BI solutions encompassing data modeling, data acquisition, transformation, and visualization. You will develop data integration pipelines to extract, transform, and load data from multiple sources into Power BI; implement data cleansing and data quality processes; create visually appealing and interactive reports and dashboards; design intuitive and user-friendly navigation and interaction experiences; identify and address performance bottlenecks in Power BI reports and dashboards; and optimize data models, queries, and visuals for improved responsiveness.

To excel as a Power BI Architect, you should hold a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, with proven experience of 4 to 5 years as a Power BI Architect or in a similar role focused on designing and implementing Power BI solutions. You should also have 8+ years of experience in Business Intelligence and good knowledge of, and prior experience with, Power BI Service features including OLS and RLS, dataflows and datamarts, deployment pipelines, and gateways.

If you are a dynamic individual with energy and passion for your work, we invite you to join our innovative and collaborative organization. We offer high-impact careers and growth opportunities across global locations, with a work environment designed to help you thrive, learn, and grow through targeted learning and development programs, as well as generous benefits and perks.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
delhi
On-site
You are invited to join our team at ThoughtSol Infotech Pvt. Ltd as a skilled and experienced Power BI Developer. If you have a passion for data visualization, analytics, and business intelligence, along with a strong background in developing and implementing Power BI solutions, we are looking for you.

Your responsibilities will include developing and maintaining Power BI dashboards, reports, and data visualizations that meet business requirements. You will design and implement data models, ETL processes, and data integration solutions using Power BI and related technologies, and collaborate with business stakeholders to gather requirements, understand data needs, and deliver actionable insights. You will also optimize the performance and usability of Power BI solutions through data modeling, query optimization, and UI/UX enhancements; implement data governance and security best practices to ensure data accuracy, integrity, and confidentiality; provide training and support to end users on Power BI usage, best practices, and troubleshooting; and stay updated on the latest Power BI features, trends, and best practices to recommend improvements to existing solutions.

To qualify for this role, you should have a minimum of 2 years of experience and hold a Bachelor's degree in Computer Science, Information Systems, or a related field. Proficiency in Power BI Desktop, Power Query, DAX, and Power BI Service is required, along with a strong understanding of data warehousing concepts, data modeling techniques, and ETL processes. Experience with SQL, T-SQL, and relational databases (e.g., SQL Server, MySQL, PostgreSQL) is also necessary, and familiarity with Azure services (e.g., Azure SQL Database, Azure Data Lake, Azure Analysis Services) is considered a plus. Excellent analytical, problem-solving, and communication skills are a must, along with the ability to work independently and collaboratively in a fast-paced environment.
Posted 1 week ago
6.0 - 9.0 years
18 - 27 Lacs
Bangalore Rural, Gurugram, Bengaluru
Work from Office
Data Modelling: Star/Snowflake schema, Normalisation/Denormalisation.
Snowflake: Schema design, performance tuning, Time Travel, Streams & Tasks, Secure & Materialised Views.
SQL & Scripting: Advanced SQL (CTEs, Window Functions), automation & optimisation.
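As a quick illustration of the advanced SQL named above, here is a self-contained Python/sqlite3 sketch combining a CTE with a window function; Snowflake's syntax for this pattern is essentially the same, and the table and its data are made up for the demo.

```python
# CTE + window function demo -- runs on Python's built-in sqlite3
# (needs SQLite 3.25+); the orders table and its rows are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
      ('acme', '2024-01-05', 120.0),
      ('acme', '2024-02-11',  80.0),
      ('globex', '2024-01-20', 200.0);
    """
)

# Latest order per customer via a CTE and ROW_NUMBER()
rows = conn.execute(
    """
    WITH ranked AS (
      SELECT customer, order_date, amount,
             ROW_NUMBER() OVER (
               PARTITION BY customer ORDER BY order_date DESC
             ) AS rn
      FROM orders
    )
    SELECT customer, order_date, amount FROM ranked WHERE rn = 1
    """
).fetchall()
print(rows)  # e.g. [('acme', '2024-02-11', 80.0), ('globex', '2024-01-20', 200.0)]
```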
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
ahmedabad, gujarat
On-site
We are looking for a Data Engineer with over 5 years of experience to join our team in Ahmedabad. As a Data Engineer, you will play a key role in transforming raw data into valuable insights and building scalable data infrastructure. Your responsibilities will include designing data pipelines, optimizing data systems, and supporting data-driven decision-making.

Key responsibilities of the role include:
- Architecting, building, and maintaining scalable data pipelines from various sources.
- Designing effective data storage and retrieval mechanisms and data models for analytics.
- Implementing data validation, transformation, and quality-monitoring processes.
- Collaborating with cross-functional teams to deliver data-driven solutions.
- Identifying bottlenecks, optimizing workflows, and mentoring junior engineers.

We are looking for a candidate with:
- 4+ years of hands-on experience in Data Engineering.
- Proficiency in Python and data pipeline design.
- Experience with big data tools like Hadoop, Spark, and Hive.
- Strong skills in SQL, NoSQL databases, and data warehousing solutions.
- Knowledge of cloud platforms, especially Azure.
- Familiarity with distributed computing, data modeling, and performance tuning.
- Understanding of DevOps, Power Automate, and Microsoft Fabric (a plus).
- Strong analytical thinking, collaboration and communication skills, and the ability to work independently or as part of a team.

Qualifications required for this position include a Bachelor's degree in Computer Science, Data Science, or a related field. If you are passionate about data engineering and have the necessary expertise, we encourage you to apply and be a part of our innovative team in Ahmedabad.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
As a Backend and Data Pipeline Engineer specializing in Python, you will be part of a dynamic team focused on leveraging cutting-edge technology to develop new products that drive growth and transformation for our customers. You will work on data integration, advanced analytics, and modern applications that cater to evolving customer needs and play a strategic role within the organization. Your contributions will be instrumental in delivering innovative solutions to complex problems and driving our business strategies forward. You will be at the forefront of innovation, particularly in the realm of data products, where you will help create automotive forecasting solutions that provide a competitive advantage to our business and valuable insights to our clients.

In this role, you will design, develop, and maintain scalable data pipelines with complex algorithms, ensuring data quality and integrity through robust validation processes. You will also build and manage UI backend services using Python or similar languages, collaborate with cross-functional teams to gather data requirements, and work with data scientists and analysts to optimize data flow and storage for advanced analytics. As a key member of the team, you will lead data integration projects, mentor junior engineers, and take ownership of the modules you work on, ensuring timely delivery with high quality and adherence to best practices in software development. Proficiency in Python, experience with Flask for backend development, and strong knowledge of object-oriented programming are essential for this role (a minimal Flask sketch follows this posting).

To excel in this position, you should hold a Bachelor's degree in Computer Science or a related field, possess strong analytical and problem-solving skills, and have at least 7 years of experience in Data Engineering/Advanced Analytics. AWS proficiency, particularly with ECR and containers, is highly desirable. If you are an innovative and mission-driven individual with a passion for leveraging cloud-native solutions to forecast trends in the automotive industry, and you thrive in a fast-paced and collaborative work environment, we invite you to join our team and make a significant impact with your technical expertise and problem-solving abilities.
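Since the role calls for building UI backend services with Flask, here is a minimal hedged sketch of such a service; the endpoint path and the in-memory data are hypothetical stand-ins for a real pipeline output store.

```python
# Minimal Flask backend serving forecast data to a UI -- a sketch;
# the endpoint path and in-memory data are hypothetical stand-ins.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for a real pipeline output store
FORECASTS = {"sedan": [10200, 10850, 11400], "suv": [8300, 8900, 9600]}

@app.route("/api/forecast/<segment>")
def get_forecast(segment: str):
    values = FORECASTS.get(segment)
    if values is None:
        return jsonify(error=f"unknown segment: {segment}"), 404
    return jsonify(segment=segment, monthly_forecast=values)

if __name__ == "__main__":
    app.run(debug=True)
```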
Posted 1 week ago
5.0 - 9.0 years
9 - 12 Lacs
Gurugram, Bengaluru
Work from Office
SQL developer with strong knowledge of SQL programming and commands: querying, aggregating, joining, and manipulating data. Experience with ETL, data transformation, data warehousing, data manipulation, and data visualisation.
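As a hedged illustration of the querying, joining, and aggregation skills listed above, the sketch below runs a join-plus-aggregation against an in-memory SQLite database; the tables and columns are invented for the example:

```python
# Hedged sketch: a join plus an aggregation over invented tables,
# run against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'North'), (2, 'South');
    INSERT INTO orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Join orders to customers, then aggregate revenue per region.
rows = conn.execute("""
    SELECT c.region, COUNT(o.id) AS order_count, SUM(o.amount) AS revenue
    FROM orders AS o
    JOIN customers AS c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC
""").fetchall()

for region, order_count, revenue in rows:
    print(region, order_count, revenue)
```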
Posted 1 week ago
2.0 - 4.0 years
4 - 6 Lacs
Bengaluru
Work from Office
Description About the Job Enphase Energy is a global energy technology company and a leading provider of solar, battery, and electric vehicle charging products. Founded in 2006, our innovative microinverter technology revolutionized solar power, making it a safer, more reliable, and scalable energy source. Today, the Enphase Energy System enables users to make, use, save, and sell their own power. Enphase is also one of the most successful and innovative clean energy companies in the world, with more than 80 million products shipped across 160 countries.

What You'll Be Doing
- Analyze real-world fleet data to develop predictive failure models and assess long-term product performance.
- Lead investigations into product failures, document root cause findings, and drive containment and corrective actions.
- Represent the Failure Analysis (FA) team in Failure Review Boards, collaborating closely with engineering and quality teams.
- Define appropriate analysis methodologies and manage external FA vendors where applicable.
- Identify opportunities to reduce FA cycle time and proactively discover emerging failure modes.
- Leverage online diagnostic systems to remotely analyze product issues and contribute to predictive and autonomous fleet recovery strategies.
- Engage with software, hardware, and product teams to understand upcoming releases, identify gaps, and develop robust data models.
- Collaborate with internal stakeholders to identify opportunities for data-driven improvements in product design, reliability, and manufacturing.
- Troubleshoot field upgrade and deployment issues, coordinate escalations, and implement model updates and data quality measures.
- Lead the design and evolution of data platforms and visualization frameworks to support scalable analysis and modern data science workflows.
- Promote and enhance the culture of safety across engineering and diagnostics teams.
- Work closely with CS/Engineering/Quality to ensure rapid turnaround of returned units and accelerate root cause diagnostics.
- Operate within a diverse, multi-cultural, and global team environment, driving innovation through inclusion.

Who You Are and What You Bring
Educational Background:
- Bachelor's degree in Electrical Engineering, EEE, or ECE from a top 100 NIRF institute with 3-4 years of relevant experience, or
- Master's degree in Power Electronics, Power Systems, or ECE from a top 100 NIRF institute with 2+ years of proven experience in failure analysis and data analytics.
Technical Skills:
- Strong hands-on experience in failure analysis of power electronics and energy devices.
- Familiarity with power supplies, inverters, or solar systems is a strong plus.
- Proficiency in Python for data analysis using libraries such as Pandas, NumPy, Scikit-learn, Seaborn, Matplotlib, or Plotly.
- Solid understanding of statistical techniques, data modeling, and quality control practices.
- Experience using data visualization and BI tools (e.g., Excel, Incorta).
Soft Skills:
- Excellent verbal and written communication skills, with the ability to translate technical insights for diverse audiences.
- Detail-oriented with a structured approach to problem-solving.
- Strong collaboration skills and the ability to thrive in cross-functional teams.
- Passionate about sustainability, reliability, and continuous improvement.
Bonus Qualifications:
- Working knowledge of electrical safety practices for batteries, microinverters, or industrial gateways.
- Safety certifications or accreditations in electrical domains are highly desirable.
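Since the role centers on predictive failure analysis of fleet data in Python with Pandas, here is a minimal hedged sketch of the kind of aggregation involved. The column names, sample data, and failure-rate definition are invented for illustration:

```python
# Hedged sketch of fleet failure-rate analysis with pandas; the schema,
# example data, and metric definitions are illustrative only.
import pandas as pd

fleet = pd.DataFrame({
    "model":    ["IQ8", "IQ8", "IQ7", "IQ7", "IQ7"],
    "region":   ["US", "EU", "US", "US", "EU"],
    "failed":   [0, 1, 0, 1, 0],          # 1 = unit returned/failed
    "age_days": [400, 380, 900, 950, 870],
})

# Failure rate by model: failures / units, plus mean age at observation.
summary = (
    fleet.groupby("model")
         .agg(units=("failed", "size"),
              failures=("failed", "sum"),
              mean_age_days=("age_days", "mean"))
)
summary["failure_rate"] = summary["failures"] / summary["units"]
print(summary.sort_values("failure_rate", ascending=False))
```

A predictive failure model would start from exactly this kind of per-cohort table, feeding the rates and covariates into a statistical or Scikit-learn model.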
Posted 1 week ago
7.0 - 12.0 years
45 - 50 Lacs
Bengaluru
Work from Office
As an MTS-4 engineer, you'll be both a systems builder and a data problem solver: designing large-scale services, leading technical direction, and building tools that extract actionable insights from telemetry and logs. This is a high-leverage role for someone who wants to build at scale, solve tough data problems, and make a measurable difference to Pure's customers.

WHAT YOU'LL BE DOING
- Architect and implement backend services for ingesting, extracting, and classifying telemetry logs at scale (e.g., the TALES platform).
- Design data pipelines to process and analyze structured and unstructured logs, enabling real-time fleet intelligence and RCA automation.
- Build scalable distributed systems using technologies like Redis, SQS, RDS, and S3 to support fault-tolerant, stateless services.
- Develop and deploy microservices on cloud-native platforms (e.g., AWS, Pure1Build) using containerized infrastructure (Docker, Kubernetes).
- Establish and maintain robust observability: define SLOs, build health checks, monitor job queues, and implement alerting using tools like Datadog and PagerDuty.
- Create and optimize data models for failure classification, metrics aggregation, and long-term log storage using SQL-based stores (PostgreSQL, Snowflake).
- Mentor junior engineers, drive technical reviews, and write high-quality design docs that guide the team's architecture decisions.
- Collaborate closely with platform, support, and product teams to align on service APIs, failure workflows, and operational ownership.
- Continuously improve system reliability, cost-efficiency, and developer experience through refactoring, automation, and design iteration.

WHAT YOU'LL NEED TO BRING TO THIS ROLE...
- 7+ years of experience in software engineering and/or data engineering, building and operating backend services in Python, Go, or Java.
- Strong knowledge of distributed systems fundamentals, service-oriented architecture, and asynchronous workflows.
- Hands-on experience building data pipelines or log analytics systems using cloud-native tools (e.g., AWS Glue, Lambda, S3, Athena, Kafka, or equivalent).
- Deep understanding of database systems (SQL and NoSQL) combined with strong data modeling and performance optimization skills.
- Cloud infrastructure expertise, ideally in AWS, including IAM, cost-aware architecture, and autoscaling principles.
- Familiarity with CI/CD pipelines, microservices lifecycle management, and containerization with Docker/Kubernetes.
- Demonstrated ability to design and lead complex, cross-functional technical initiatives with minimal oversight.
- Excellent communication skills: able to explain design trade-offs, simplify complex systems, and influence across teams.
- A strong mentoring mindset and a willingness to help level up peers and junior team members.
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

WHAT YOU CAN EXPECT FROM US:
- Pure Innovation: We celebrate those who think critically, like a challenge, and aspire to be trailblazers.
- Pure Growth: We give you the space and support to grow along with us and to contribute to something meaningful. We have been named one of Fortune's Best Large Workplaces in the Bay Area and Fortune's Best Workplaces for Millennials, and certified as a Great Place to Work!
- Pure Team: We build each other up and set aside ego for the greater good.
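The core of the role is classifying telemetry logs at scale; the sketch below shows the shape of a rule-based classification and aggregation step. The signature patterns and log format are invented for illustration and are not the TALES platform's actual rules:

```python
# Hedged sketch of rule-based failure classification over telemetry log lines.
# Patterns and log format are invented; a real service would pull batches from
# a queue (e.g., SQS) and persist per-label counts to a SQL store.
import re
from collections import Counter

# Ordered (label, pattern) rules: first match wins.
SIGNATURES = [
    ("disk_failure", re.compile(r"I/O error|medium error", re.I)),
    ("oom",          re.compile(r"out of memory|oom-killer", re.I)),
    ("net_timeout",  re.compile(r"connection timed out", re.I)),
]

def classify(line: str) -> str:
    for label, pattern in SIGNATURES:
        if pattern.search(line):
            return label
    return "unclassified"

def aggregate(lines):
    """Aggregate per-label counts, the metric a fleet dashboard would chart."""
    return Counter(classify(line) for line in lines)

if __name__ == "__main__":
    sample = [
        "kern: sda I/O error, sector 1234",
        "oom-killer invoked by process 42",
        "svc: connection timed out after 30s",
        "boot completed in 12s",
    ]
    print(aggregate(sample))
```

Keeping the classifier a pure function over a batch of lines is what makes the service stateless and horizontally scalable: any worker can take any batch, and the aggregated counts merge trivially downstream.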
Posted 1 week ago