0 years
3 - 7 Lacs
Noida
On-site
Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant - Qlik and Tableau Developer!

We are looking for a talented and experienced Qlik developer to join our team. The ideal candidate will have a strong background in data analytics and visualization and hands-on experience in Qlik. Additionally, proficiency in Power BI or Tableau is required to support our data visualization needs.

Key Responsibilities:
- Develop, design, and maintain Qlik applications to support data analysis and reporting.
- Collaborate with business stakeholders to understand data requirements and deliver effective data solutions.
- Create and optimize complex data models and visualizations, ensuring data accuracy and efficiency.
- Utilize Power BI or Tableau to develop interactive and insightful dashboards and reports.
- Perform data validation and quality checks to ensure data integrity.
- Troubleshoot and resolve issues related to Qlik applications and data visualizations.
- Stay updated on the latest trends and best practices in data analytics and visualization tools.
- Provide training and support to team members on Qlik and data visualization tools.

Qualifications we seek in you!

Minimum Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Experience working with Qlik.
- Proficiency in Power BI or Tableau for data visualization.
- Strong understanding of data modeling and ETL processes.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data analytics tools and programming languages (e.g., SQL, Python).
- Certification in Qlik and/or Power BI/Tableau.
- Experience in a similar industry or domain.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.
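The data modeling and ETL skills asked for above are tool-specific (Qlik, Power BI, Tableau), but the underlying extract-transform-load pattern is generic. A minimal stdlib-only Python sketch, with hypothetical table and column names for illustration:

```python
import sqlite3

# Extract: raw order rows as they might arrive from a source system
# (invented data, not from any real feed).
raw_orders = [
    ("O-1", "North", "150.5"),
    ("O-2", "South", "99.0"),
    ("O-3", "North", "200.0"),
]

# Transform: cast amounts to numbers and normalise region labels.
clean_orders = [(oid, region.upper(), float(amount))
                for oid, region, amount in raw_orders]

# Load: persist into a simple fact table, then aggregate the way a
# dashboard visual (sales by region) would.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_orders (order_id TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean_orders)

sales_by_region = dict(conn.execute(
    "SELECT region, SUM(amount) FROM fact_orders GROUP BY region"
))
print(sales_by_region)  # {'NORTH': 350.5, 'SOUTH': 99.0}
```

In Qlik or Power BI the same three steps live in the load script or Power Query; the sketch only shows the shape of the work.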
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Principal Consultant
Primary Location: India-Noida
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 24, 2025, 3:27:31 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
Posted 1 month ago
8.0 years
4 - 4 Lacs
Noida
On-site
Assistant Vice President EXL/AVP/1378083 Human Resources, Noida
Posted On: 05 Jun 2025
End Date: 20 Jul 2025
Required Experience: 8 - 16 Years

Basic Section
Number Of Positions: 1
Band: D1
Band Name: Assistant Vice President
Cost Code: G050403
Campus/Non Campus: NON CAMPUS
Employment Type: Permanent
Requisition Type: Backfill
Max CTC: 3500000.0000 - 4500000.0000
Complexity Level: Not Applicable
Work Type: Hybrid – Working Partly From Home And Partly From Office
Organisational Group: Enabling
Sub Group: Human Resources
Organization: Human Resources
LOB: Human Resources
SBU: Corporate HR Services - Compensation & Benefits
Country: India
City: Noida
Center: IN Noida SEZ C60

Skills: COMPENSATION BENCHMARKING, POWERPOINT, STAKEHOLDER MANAGEMENT, EXCEL PROFICIENCY, RESEARCH & ANALYTICS, LABOUR LAW
Minimum Qualification: B.COM, MBA/PGPM
Certification: No data available

Job Description
Job Title: Geography Rewards Lead (South Africa)
Job Category: Permanent
Department/Group: HR
Reporting: Dotted line - Regional Head of Rewards (SA & EU); Solid line - Geo Head of HR (SA)
Location: NCR, Delhi
Level: C2/D1
Scope / Span: 4500+ Employees
Position Type: Full Time

Roles & Responsibilities
Responsible for designing and implementing all aspects of the employee rewards programs for South Africa, including compensation, benefits, and short-term incentives. The incumbent will also, in partnership with the Global Rewards team, lead, review, and design to ensure that the C&B practices of the organization are internally equitable, market competitive, and aligned with the company's performance and affordability. The role requires a deep understanding of external best practices and reward strategies, combined with a track record of conceptualizing successful strategies, developing and leading action plans, and the ability to execute and operationalize in a scalable way to amplify business productivity and sustain a high-performing culture that attracts, motivates, and retains talent.
- Work with the Corporate Rewards Team on benchmarking of salary/benefits data and participation in regular/industry/forum-level surveys. Analyze and share reports/insights with Leadership/Top Management.
- Monitor, evaluate, and design hiring ranges with Recruitment, HRBP, and Business.
- Work closely with HR leaders and C&B on promotions, market corrections, geo mobility, and other cases for compensation-related recommendations.
- Design, monitor, and manage benefits. Lead strategy and design to deliver quality benefit programs that remain competitive and cost-effective.
- In partnership with our brokers, participate in negotiation of agreements with insurance carriers and financial institutions for administration of benefits programs.
- Calibrate with the Corporate Rewards Team on annual processes like merit increases, bonuses, and equity grants. Evaluate effectiveness of STI (bonus, P4P, sales commission)/LTI (equity) plans.
- Track and work on DEI, pay equity, and gender pay analysis with the corporate team.
- Liaise with the Compliance/Legal team and ensure compliance with regulations and company policies in compensation decisions.
- Design, upkeep, and update C&B policies. Ensure effective and timely communications to employees.
- Partner with the Global Rewards team to develop monthly/quarterly dashboards and analysis.
- Work closely with Corporate Rewards and the finance team on budgeting and pricing of jobs.
- Manage the data processes required by internal and external sources, including resolving data errors and partnering with HR shared services as needed. Prepare reports and analysis of compensation data.
- Partner with HR Business Partners and Management to review findings and recommendations.
- Partner with other HR team members/leaders and functional areas to investigate and resolve escalated employee issues (compensation matters).
- Collaborate with HRIS/HRSS/DHRO to maintain systems related to compensation and benefits administration.
Competencies, Skills and Values & Behavior
- Knowledge of the industry, local legislation (South Africa), and HR statutory compliances.
- Proficient with MS Excel, PowerPoint, and Word.
- Strong data acumen and decision-making ability; relationship management; strategic and commercial thinking.
- Experience implementing cost-saving methods and improving operational efficiency.
- Analytical skills with the ability to use data proactively to address opportunities.
- Ability to communicate in a clear and confident manner.
- Ability to elicit cooperation from teams, management, and external stakeholders.
- Experience building and cultivating effective working relationships with multiple stakeholders (internal and external).
- Prior knowledge of Oracle Fusion and Power BI is desired. Understanding of tools like Tableau, Power BI, Qlik Sense, etc. is desired.

Minimum Requirements
- 8+ years' experience in Compensation and Benefits, with a minimum of 3-5 years leading the function or geography (South Africa).
- ITES/KPO/Analytics industry background.
- Strong analytical and advanced Excel skills.
- Excellent written and verbal communication and influencing skills, with strong presentation skills.
- Strong stakeholder relationship/partnership experience within a global setting.

Workflow Type: Back Office
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from Sona Comstar! Please find the job description below.

Company Details: Sona BLW Precision Forgings Ltd is an automotive technology company in India. It is primarily engaged in designing, manufacturing, and supplying high-quality, mission-critical automotive components such as differential assemblies, gears, conventional and micro-hybrid starter motors, and others. Headquartered in Gurugram, India, with 4,500 employees, the company has two business verticals: Driveline Business and Motor Business. With 10 manufacturing facilities in 4 countries (India, USA, China, Mexico), the company focuses on conventional products like differential bevel gears, differential assemblies, portal axle gears, and starter motors, and EV-focused products like differential assemblies, reduction gears, traction motors, motor controllers, e-axles, and integrated motor controller modules. The company holds leading industry certifications including IATF 16949, ISO 14001, ISO 45001, ISO 50001, TPM, ENMS, OHSAS 18001, ASES, and VQE. With significant production volumes since inception and leading market shares in our products, we supply most of the major global OEMs, geographically deriving most of the revenue from outside India.

Company Website: https://sonacomstar.com/
Experience: 3+ Years
Qualification: B.E/B.Tech
Location: Chennai – Padur

Job Description
Position Responsibilities:
- Develop Python scripts for data manipulation, automation, and creating machine learning models.
- Implement and maintain AI/ML algorithms such as regression, classification, clustering, and neural networks.
- Build and deploy machine learning models using libraries like TensorFlow, Keras, scikit-learn, and PyTorch.
- Utilize Power BI to create interactive dashboards, reports, and data visualizations to communicate insights.
- Collaborate with cross-functional teams to gather requirements, develop data pipelines, and ensure effective use of ML models in business processes.
- Conduct linear regression analysis to identify trends and relationships in large datasets.
- Continuously optimize and improve existing machine learning models for better performance and scalability.
- Provide regular updates on model performance, conduct testing, and fine-tune the algorithms for enhanced accuracy.
- Document code and the model development process to maintain transparency and support ongoing team development.

Required Experience:
- 3-5 years of experience in Python programming, specifically for data analysis, automation, and machine learning.
- Strong knowledge of machine learning algorithms (e.g., linear regression, decision trees, random forests, SVM, etc.).
- Hands-on experience with Power BI for developing dashboards, reports, and business analytics solutions.
- Experience with data pre-processing, feature engineering, and model evaluation.
- Solid understanding of statistics and linear regression analysis, including interpreting results and deriving actionable insights.
- Familiarity with SQL and database management systems to extract, clean, and analyze data.
- Proven ability to work collaboratively in a team environment and communicate technical results to non-technical stakeholders.

Desired Experience:
- Knowledge of big data technologies for handling large-scale datasets.
- Familiarity with business intelligence tools like Tableau, Qlik, or others, in addition to Power BI.
- Ability to apply advanced statistical techniques to improve model accuracy (e.g., regularization, cross-validation).
- Experience in deploying ML models to production environments and monitoring their performance over time.

Contact: KEERTHANA S, 7845793459
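The linear regression analysis mentioned above is normally done with scikit-learn or statsmodels; as a dependency-free sketch of what a one-feature `fit` actually computes (ordinary least squares, on made-up data):

```python
def fit_simple_ols(xs, ys):
    """Ordinary least squares for y = slope * x + intercept (one feature)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Made-up data lying exactly on y = 2x + 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]
slope, intercept = fit_simple_ols(xs, ys)
print(slope, intercept)  # 2.0 1.0
```

With scikit-learn the same result comes from `LinearRegression().fit(...)`; the point of the sketch is only the trend/relationship estimate the role describes.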
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Corporate Trainer – Data, Analytics & BI (Full-Time)

Base Location: Nagpur or Pune (with global travel as needed)
Employment Type: Full-Time
Communication: Excellent English communication skills — you’ll be engaging global audiences

Ready to lead the future of corporate learning? At Nice Software Solutions Pvt Ltd, we’re looking for a dynamic, passionate Corporate Trainer who can deliver world-class training in data, analytics, and BI technologies (Python, SQL, Power BI and more). You’ll work with global clients, design custom programs, and empower teams to make smarter, data-driven decisions. If you can own a room — virtual or in-person — and love making tech engaging, this is your opportunity.

What you’ll do
- Deliver corporate training sessions (minimum 10 days/month) for international clients
- Design custom training programs and support pre-sales conversations
- Develop engaging content — e-learning modules, manuals, presentations
- Lead and mentor the training team, building a high-performance culture
- Collaborate with clients to understand training needs, plan delivery, and ensure ROI
- Stay ahead with emerging tools, technologies, and best practices
- Travel globally as per training schedules

What we’re looking for
- Strong personality with excellent interpersonal and presentation skills
- Proven B2B corporate training experience (data, analytics, BI)
- Flexibility for international travel
- Confidence in pre-sales conversations and customizing training plans
- Continuous learning mindset
- Ability to lead and collaborate effectively

Why Nice Software Solutions
Since 2008, we’ve delivered 600+ projects to 100+ clients across the globe. Our tech stack spans Power BI, Qlik, ThoughtSpot, Appian, Azure, Snowflake, Databricks, MicroStrategy and more. We offer:
- Innovative, high-impact projects
- Global exposure
- Continuous upskilling and professional growth

We don’t just teach analytics — we live it.
Posted 1 month ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsibilities:
- Provide data analytics, risk management, and IT audit support during business development pursuits, e.g., proposals, cost build-ups, sales meetings.
- Identify, prioritize, and execute on high-value opportunities to improve data risk services methodologies, including developing and delivering training, whitepapers, and desktop procedures for best-practice evaluation methods by business application (prioritizing Oracle Fusion, SAP ECC and SAP S/4HANA, Microsoft D365, Workday, NetSuite, and other tier 1 business applications).
- Identify and prioritize high-value opportunities to improve audit and compliance processes through analytics and automation, particularly in areas unique to Data GRC (e.g., metadata management, master data management, data lineage capture and mapping, risk and controls design and testing, upstream and downstream data quality and accuracy validations, etc.).
- Develop and implement data analytics solutions, including creating dashboards and reports. This role requires technical expertise to directly build and manage analytics. The specialist will actively engage in data analysis, build visualizations, and provide actionable insights to support decision-making.
- Upskill and train more junior staff on best practices and approaches to data and risk management, including risk management and internal audit basics, analytics, and automation.
- Execute and review all work-papers and deliverables, including reporting to client stakeholders.
- Provide guidance to other internal and external stakeholders (clients, industry events, market events, etc.) on related data risk and analytics best practices.
- Facilitate sessions with internal and external personnel to effectively design methodology that: a) helps audit/compliance professionals learn more about the business in order to better focus attention on the areas of highest risk, and b) identifies issues and potential process exceptions.
- Manage communication with IT and/or business resources to locate internal and external data for analysis, understand the data, and make data requests or direct connections to databases.
- Champion sustainable data risk, analytics, and automation design concepts.
- Manage the development of visualizations, dashboards, and scripts, using agile development methodology.
- Perform quality assurance over developer practices for data mapping, data transformations, data joining/blending, data quality, data cleansing, and other data movement activities.
- Provide guidance to both internal and external stakeholders on interpreting analytic results.
- Coordinate data risk services with offshore resources at the RSM Delivery Center in India and El Salvador.
- Be an active participant in local employee network groups and build relationships with RSM members across all lines of business and consulting, representing practice services and capabilities.

Position Requirements:
- Experience working with a team to provide services to numerous clients simultaneously.
- Project and program management expertise and strong written and verbal communication skills.
- Detail-oriented with a proactive, inquisitive, and creative approach to work; an analytics and technology inclination is preferred.
- Experience as an auditor or supporting internal or external audit teams, with a fundamental understanding of enterprise risk management and compliance and/or best-practice frameworks such as COSO, Sarbanes-Oxley (SOX), COBIT, etc.
- Understanding of basic accounting, operations, and auditing concepts and reporting skills, including documentation requirements.
- Understanding of, and ability to describe, the flow of typical business processes, covering the purchase-to-pay, order-to-cash, and record-to-report cycles at a minimum.
- Understanding of automation capabilities, such as robotic process automation, machine learning, natural language processing, application programming interfaces, process mining, etc.

Minimum Qualifications:
- Undergraduate degree in Accounting, Management Information Systems, Computer Science, or equivalent level of education.
- Minimum of 3 years in IT audit and/or compliance, with expertise in key report testing and experience in testing IT application controls, business process controls, and IT general controls.
- Minimum of 3 years' experience in technical analytics using analytics and cleansing tools such as Alteryx.
- Minimum of 3 years in public accounting in an audit or risk advisory services capacity.
- CPA, CISA, CIA, or other related certification.

Preferred Qualifications:
- Experience with data analytics of large ERP applications such as MS D365, SAP, Oracle, NetSuite, and Workday.
- Hands-on experience using audit-focused GRC technologies such as AuditBoard, ServiceNow, TeamMate, IDEA, and Wdesk.
- Experience using other industry-standard data analysis technologies such as Alteryx, SAS, SQL, and/or Python.
- Experience developing and/or managing dashboard solutions created using Power BI, Tableau, Qlik, or similar technologies.
- Experience with process mining using tools like Celonis or ABBYY Timeline.
- Experience working with automation software such as Power Automate, Automation Anywhere, and UiPath.
- Experience working with data from cloud-based applications like Workday, NetSuite, Salesforce, and Concur is a plus.
- Business development experience is a plus.
- Certifications in one or more data analysis technologies such as Alteryx, UiPath, Tableau, or Power BI.

Standards of Performance:
- Data stewardship: maintain confidentiality, integrity, and availability of information in your custody.
- A self-starter with a process improvement mentality who is hands-on, results-oriented, and leads by example.
- A strong entrepreneurial spirit with the highest levels of professional and personal honesty, integrity, and ethics.
- Excellent organizational skills and the ability to prioritize multiple tasks, projects, and assignments.
- Ability to interact with all levels of client staff, including executives and senior managers.
- Strong business ethics and willingness to adhere to stringent professional standards.
- Ability to put forth additional effort to meet deadlines when necessary.
- Ability to travel to the local office at least 3 days per week.
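Joining/blending and cleansing steps of the kind this role reviews (Alteryx-style workflows) can be prototyped in plain Python. A sketch with hypothetical payments and vendor-master extracts, including the unmatched-record flagging that a typical audit exception test relies on:

```python
# Hypothetical extracts: a payments ledger and a vendor master file
# (invented records for illustration only).
payments = [
    {"vendor_id": "V1", "amount": "1000.00"},
    {"vendor_id": "V2", "amount": "250.00"},
    {"vendor_id": "V9", "amount": "75.00"},   # no matching master record
]
vendor_master = {"V1": "Acme Ltd", "V2": "Globex"}

# Cleansing: cast amounts to numbers.
# Joining/blending: left-join on vendor_id, routing unmatched rows to an
# exceptions list for follow-up rather than silently dropping them.
blended, exceptions = [], []
for row in payments:
    record = {"vendor_id": row["vendor_id"],
              "vendor_name": vendor_master.get(row["vendor_id"]),
              "amount": float(row["amount"])}
    (exceptions if record["vendor_name"] is None else blended).append(record)

print(len(blended), len(exceptions))  # 2 1
```

In Alteryx the same shape is a Join tool with its L/R/J outputs; the exceptions list corresponds to the unjoined output that auditors inspect.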
Posted 1 month ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
We have an immediate opportunity for "Project Manager" with our client. Interested candidates, send your CV to prave.p@lancesoft.com.

Position: Project Manager
Duration: 6 Months
Location: India (Remote)

Operating Environment, Framework and Boundaries, Working Relationships
- Lead and influence others, including those more senior, on best practices related to data engineering, including SQL and BI reporting tools such as Qlik and Power BI.
- Experience in the core banking domain will be preferred.

Knowledge, Skills and Experience
- At least 15-20+ years of experience in Project Management and at least 2 years' experience as Scrum Master in an Agile project.
- Must have completed at least 20 Agile project cycles.
- Must have used agile platforms such as JIRA, Azure DevOps, etc.
- SQL and database working experience is preferred.
- Attained the Certified Scrum Master or an equivalent qualification.
- Good knowledge of commonly used agile tools and practices.
- Team player with supervisory, coaching, planning, and management skills.
- Strong written and verbal communication skills.
- Strong interpersonal skills.

Interested candidates, send your CV along with the details below:
- Expected salary:
- Visa/Work Permit:
- Notice Period:
- Current Location:
Posted 1 month ago
9.0 years
0 Lacs
Ernakulam, Kerala, India
On-site
Job Description

Position: AI Architect (Permanent only)
Experience: 9+ years (8 years of relevant experience is a must)
Notice Period: Immediate to 45 days
Key Skills: Python, Data Science (AI/ML), SQL
Location: TVM/Kochi/Hybrid

Job Purpose
Responsible for consulting with the client to understand their AI/ML and analytics needs and delivering AI/ML applications to the client.

Job Description / Duties & Responsibilities
▪ Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems
▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders
▪ Work closely with the data engineering team in identifying relevant data and pre-processing the data for suitable models
▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks
▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible
▪ Work closely with the application team to deliver AI/ML solutions as microservices

Job Specification / Skills and Competencies
▪ Master's/Bachelor's in Computer Science, Statistics, or Economics
▪ At least 6 years of experience working in the Data Science field; passionate about numbers and quantitative problems
▪ Deep understanding of machine learning models and algorithms
▪ Experience in analysing complex business problems, translating them into data science problems, and modelling data science solutions for the same
▪ Understanding of and experience in one or more of the following machine learning algorithms: Regression, Time Series, Logistic Regression, Naive Bayes, kNN, SVM, Decision Trees, Random Forest, k-Means Clustering, etc.
▪ NLP, Text Mining, LLMs (GPTs)
▪ Deep learning, reinforcement learning algorithms
▪ Understanding of and experience in one or more of the machine learning frameworks: TensorFlow, Caffe, Torch, etc.
▪ Understanding of and experience building machine learning models using various packages in one or more of the programming languages: Python / R
▪ Knowledge of and experience with SQL, relational databases, NoSQL databases, and data warehouse concepts
▪ Understanding of AWS/Azure cloud architecture
▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda)
▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.)
▪ Adherence to Information Security Management policies and procedures

Soft Skills Required
▪ Must be a good team player with good communication skills
▪ Must have good presentation skills
▪ Must be a proactive problem solver and a self-driven leader
▪ Ability to manage and nurture a team of data scientists
▪ Desire for numbers and patterns
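Delivering a model as a microservice (the posting names Flask, Azure Functions, and AWS Lambda as deployment options) reduces to the same request-in, prediction-out pattern regardless of framework. A stdlib-only sketch with a stand-in model, so no assumptions are made about any particular framework's API:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

def predict(features):
    """Stand-in for a trained model: a fixed linear score."""
    return sum(0.5 * f for f in features)

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Parse the JSON request body, score it, and return JSON.
        body = self.rfile.read(int(self.headers["Content-Length"]))
        features = json.loads(body)["features"]
        payload = json.dumps({"prediction": predict(features)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Serve on an ephemeral port and call the endpoint once.
server = ThreadingHTTPServer(("127.0.0.1", 0), PredictHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/predict",
    data=json.dumps({"features": [2.0, 4.0]}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)
print(result)  # {'prediction': 3.0}
server.shutdown()
```

In Flask the handler body becomes a route function, and in Azure Functions or AWS Lambda the HTTP plumbing is provided by the platform; only the predict step changes with the real model.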
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focussed and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Responsibilities
- Perform testing of IT Application Controls (ITAC/automated controls), IPE, and interface controls through code reviews, and IT General Controls (ITGC/GITC) reviews covering areas such as change management, access management, backup management, incident and problem management, SDLC, data migration, batch job scheduling/monitoring, and business continuity and disaster recovery.
- Perform risk assessment, identification, and evaluation of controls; prepare process flow diagrams and document them in a Risk & Control Matrix.
- Perform business process walkthroughs and controls testing for IT audits.
- Plan and execute audits, including SOX, internal audits, and external audits.
- Conduct controls assessments in manual/automated environments.
- Prepare/review policies, procedures, and SOPs.
- Maintain relationships with client management and the project manager to manage expectations of service, including work products, timing, and deliverables.
- Demonstrate a thorough understanding of complex information systems and apply it to client situations.
- Use extensive knowledge of the client's business/industry to identify technological developments and evaluate impacts on the work to be performed.
- Coordinate effectively and efficiently with the engagement manager and client management, keeping both constantly updated on the project's progress.
- Collaborate with other members of the engagement team to plan the engagement and develop relevant workpapers/deliverables.
- Perform fieldwork and share daily progress, informing supervisors of engagement status.

Qualifications
- MBA/M.Tech/MS (full time) with a minimum of 3 years' experience.
- IT audit + SAP experience, with knowledge of IT governance practices.
- Prior IT audit knowledge in areas of ITGC, ITAC (application/automated controls), SOX 404, and SOC-1 and SOC-2 audits.
- Good to have: knowledge of other IT regulations, standards, and benchmarks used by the IT industry (e.g., NIST, PCI-DSS, ITIL, OWASP, SOX, COBIT, SSAE18/ISAE 3402, etc.).
- Technical knowledge of IT audit tools, with excellent knowledge of IT audit process and methodology.
- Exposure to risk management and governance frameworks/systems will be an added advantage.
- Exposure to ERP systems will be an added advantage.
- Strong project management, communication (written and verbal), and presentation skills.
- Knowledge of security measures and auditing practices within various applications, operating systems, and databases.
- Strong self-directed work habits, exhibiting initiative, drive, creativity, maturity, self-assurance, and professionalism.
- Preferred certifications: CISA/CISSP/CISM.
- Exposure to automation and data analytics tools such as QlikView/Qlik Sense, ACL, and Power BI will be an advantage.
- Proficiency with Microsoft Word, Excel, Visio, and other MS Office tools.

Equal Opportunity Employer KPMG India
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability, or other legally protected status. KPMG India values diversity, and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary, and refusal to submit such information will not be prejudicial to you.
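Automated testing of an ITGC such as change management (one of the review areas listed above) often reduces to matching records against an approval trail. A sketch with hypothetical extracts; in a real engagement the change log and ticket list would come from the client's systems:

```python
# Hypothetical change-management log and approval tickets
# (invented identifiers for illustration).
changes = [
    {"change_id": "CHG-101", "ticket": "APPR-9"},
    {"change_id": "CHG-102", "ticket": None},        # no approval recorded
    {"change_id": "CHG-103", "ticket": "APPR-99"},   # ticket not found
]
approved_tickets = {"APPR-9"}

# Exception test: every change must map to a valid approval ticket.
# Anything that fails the lookup becomes an exception for follow-up.
exceptions = [c["change_id"] for c in changes
              if c["ticket"] not in approved_tickets]
print(exceptions)  # ['CHG-102', 'CHG-103']
```

Tools like QlikView or Power BI mentioned above would surface the same exception population as a dashboard visual; the control logic itself is this lookup.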
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
At ABB, we help industries outrun - leaner and cleaner. Here, progress is an expectation - for you, your team, and the world. As a global market leader, we’ll give you what you need to make it happen. It won’t always be easy; growing takes grit. But at ABB, you’ll never run alone. Run what runs the world.

This Position Reports To: HR Operations Team Lead

Your Role And Responsibilities
The People Analytics organization is a global function that works with various ABB business divisions and countries, delivering operational and expert services to ABB’s HR community. We aim to unlock the potential of data to help ABB business leaders and managers take better decisions, which will enable us to build a more sustainable and resource-efficient future in electrification and automation. In this role, you will have the opportunity to partner with senior stakeholders to help them conceptualize KPIs in areas of talent, performance, and culture and build statistical and analytics solutions from scratch, with a bias towards creating clean, effective, user-focused visualizations to deliver actionable insights and analysis, using technologies that vary based on purpose: Python, Snowflake, Power BI, advanced Excel, VBA, or other new technologies.

The work model for the role is: Hybrid. This role contributes to the People Analytics function supporting various business functions, based out of Bangalore.

You Will Be Mainly Accountable For
- Capably interacting with and managing global ABB leadership to seek and provide meaningful and actionable insights in all interactions.
- On-time delivery of actionable insights, from requirement gathering and data extraction to reporting/presenting the findings, in an IC role or with the team as per project needs.
- Constantly looking out for ways to enhance value for your respective stakeholders/clients.
- Developing frameworks and plug-and-play solutions using diagnostic, predictive, and machine learning techniques on Snowflake/Python.
- Executing strategic projects to help ABB improve excellence in people, performance, and culture.

Qualifications For The Role
- Bachelor's/master's degree in Applied Statistics/Mathematics, Engineering, Operations Research, or a related field.
- At least 3-5 years of experience in consulting, shared services, or software development, with proficient data analysis techniques using technologies like Excel, VBA scripting, Python, and Snowflake. Understanding of SAP/Workday HCM/Snowflake systems is preferred.
- A motivated mindset with advanced quantitative skills and the ability to work on large datasets; able to generate actionable insights from data analysis that translate to valuable business decisions for the client.
- Capable EDA practitioner with extensive knowledge of advanced Excel functionalities.
- Experience in designing and maintaining dashboards/reports providing diagnostic and forecasting views using VBA, Power BI, Qlik, or Tableau.
- Collaborative worker with the excellent collaboration skills required to work in a global virtual work environment: team-oriented, self-motivated, and able to lead small to mid-size projects.

What's In It For You
We empower you to take initiative, challenge ideas, and lead with confidence. You’ll grow through meaningful work, continuous learning, and support that’s tailored to your goals.
Every idea you share and every action you take contributes to something bigger. More About Us ABB Robotics & Discrete Automation Business area provides robotics, and machine and factory automation including products, software, solutions and services. Revenues are generated both from direct sales to end users as well as from indirect sales mainly through system integrators and machine builders. www.abb.com/robotics #ABBCareers #RunwithABB #Runwhatrunstheworld We value people from different backgrounds. Could this be your story? Apply today or visit www.abb.com to read more about us and learn about the impact of our solutions across the globe. #RunWhatRunsTheWorld Fraud Warning: Any genuine offer from ABB will always be preceded by a formal application and interview process. We never ask for money from job applicants. For current open positions you can visit our career website https://global.abb/group/en/careers and apply. Please refer to detailed recruitment fraud caution notice using the link https://global.abb/group/en/careers/how-to-apply/fraud-warning. 96249028
Posted 1 month ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities: Perform development and support activities for the data warehousing domain using ETL tools and technologies. Understand high-level design, application interface design, and build low-level design. Perform application analysis and propose technical solutions for application enhancement or resolve production issues. Perform development and deployment tasks, including coding, unit testing, and deployment. Create necessary documentation for all project deliverable phases. Handle production issues (Tier 2 support, weekend on-call rotation) to resolve production issues and ensure SLAs are met. Technical Skills: Mandatory: Working experience in Azure Databricks/PySpark. Expert knowledge in Oracle/SQL with the ability to write complex SQL/PL-SQL and performance tune. 2+ years of experience in Snowflake. 2+ years of hands-on experience in Spark or Databricks to build data pipelines. Strong experience with cloud technologies. 1+ years of hands-on experience in development, performance tuning, and loading into Snowflake. Experience working with Azure Repos or GitHub. 1+ years of hands-on experience with Azure DevOps, GitHub, or any other DevOps tool. Hands-on experience in Unix and advanced Unix shell scripting. Open to working in shifts. Good to Have: Willingness to learn all data warehousing technologies and work outside of the comfort zone in other ETL technologies (Oracle, Qlik Replicate, Golden Gate, Hadoop). Hands-on working experience is a plus. Knowledge of job schedulers. Behavioral Skills: Eagerness and hunger to learn. Good problem-solving and decision-making skills. Good communication skills within the team, site, and with the customer. Ability to stretch working hours when necessary to support business needs. Ability to work independently and drive issues to closure. Consult when necessary with relevant parties, raise timely risks. Effectively handle multiple and complex work assignments while consistently delivering high-quality work. 
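The core of the development work described above is the extract-transform-load pattern: pull rows from a source system, apply business rules, and load the result into a warehouse table. As a purely illustrative sketch (stdlib sqlite3 stands in for Oracle/Snowflake; all table and column names are hypothetical, not from any actual project):

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Toy ETL step: extract raw orders, transform (filter + derive), load a reporting table."""
    cur = conn.cursor()
    # Extract: read raw rows from the (hypothetical) source table.
    rows = cur.execute("SELECT order_id, amount, status FROM raw_orders").fetchall()
    # Transform: keep completed orders and derive a tax-inclusive amount.
    transformed = [(oid, round(amt * 1.18, 2)) for oid, amt, st in rows if st == "COMPLETED"]
    # Load: idempotent full refresh of the reporting table.
    cur.execute("DELETE FROM fact_orders")
    cur.executemany("INSERT INTO fact_orders (order_id, gross_amount) VALUES (?, ?)", transformed)
    conn.commit()
    return len(transformed)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE raw_orders (order_id INT, amount REAL, status TEXT);
        CREATE TABLE fact_orders (order_id INT, gross_amount REAL);
        INSERT INTO raw_orders VALUES (1, 100.0, 'COMPLETED'), (2, 50.0, 'CANCELLED');
    """)
    print(run_etl(conn))  # loads 1 row
```

In a real pipeline the same shape would be expressed in PySpark or Databricks notebooks reading from cloud storage, with the idempotent-load step becoming a Delta or Snowflake merge.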
Matrix is a global, dynamic, and fast-growing leader in technical consultancy and technology services, employing over 13,000 professionals worldwide. Since its founding in 2001, Matrix has expanded through strategic acquisitions and significant ventures, cementing its position as a pioneer in the tech industry. We specialize in developing and implementing cutting-edge technologies, software solutions, and products. Our offerings include infrastructure and consulting services, IT outsourcing, offshore solutions, training, and assimilation. Matrix also proudly represents some of the world's leading software vendors. With extensive experience spanning both private and public sectors—such as Finance, Telecom, Healthcare, Hi-Tech, Education, Defense, and Security—Matrix serves a distinguished clientele in Israel and an ever-expanding global customer base. Our success stems from a team of talented, creative, and dedicated professionals who are passionate about delivering innovative solutions. We prioritize attracting and nurturing top talent, recognizing that every employee’s contribution is essential to our success. Matrix is committed to fostering a collaborative and inclusive work environment where learning, growth, and shared success thrive. Join the winning team at Matrix! Here, you’ll find a challenging yet rewarding career, competitive compensation and benefits, and opportunities to be part of a highly respected organization—all while having fun along the way. To Learn More, Visit: www.matrix-ifs.com EQUAL OPPORTUNITY EMPLOYER: Matrix is an Equal Opportunity Employer and Prohibits Discrimination and Harassment of Any Kind. Matrix is committed to the principle of equal employment opportunity for all employees, providing employees with a work environment free of discrimination and harassment. 
All employment decisions at Matrix are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion or belief, family or parental status, or any other status protected by the laws or regulations in our locations. Matrix will not tolerate discrimination or harassment based on any of these characteristics. Matrix encourages applicants of all ages.
Posted 1 month ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Position: Business Analyst. BU: Financial Management - Business Finance (Corporate). Objective: To prepare and provide various reports and insights to aid senior management in strategic decision making. Responsibilities: Preparing and analyzing profitability statements and trends in business performance across products/teams/geographies for management decision making. Engage in product profitability with various product groups such as Trade and CMS Services, Treasury, etc. Preparing and analyzing business and product KPI development and measurement. Prepare budgets for Corporate Business segments and products in coordination with various stakeholders. Projects: FTP implementation as per policy, automation of dashboards, data enhancements, and process improvements. Prepare theme-based analysis for senior management and other stakeholders. Manage and resolve Internal Audit queries. Essential competencies: 2+ years’ experience, preferably from the banking industry. Conceptual understanding of wholesale banking products. Experience in profitability analysis of banking products in Corporate Lending, Trade Services, Treasury, etc. Proficiency in Microsoft Excel and exposure to MIS automation needs. Proficiency in providing accurate information with insights in a fast-paced, decision-centric environment. Familiarity with reporting tools: Qlik, Tableau, etc. Good communication and presentation skills. Qualifications: MBA/CA
Posted 1 month ago
6.0 - 11.0 years
15 - 27 Lacs
Bengaluru
Hybrid
Primary Responsibilities: Develop visual reports, dashboards and KPI scorecards using Power BI Desktop. Build Analysis Services reporting models. Connect to data sources, importing and transforming data for Business Intelligence. Implement row-level security on data and understand application security layer models in Power BI. Integrate Power BI reports into other applications using embedded analytics like Power BI service (SaaS), or by API automation. Use advanced-level calculations on the data set. Design and develop Azure-based, data-centric applications to manage large healthcare data. Design, build, test and deploy streaming pipelines for data processing in real time and at scale. Create ETL packages. Make use of Azure cloud services in ingestion and data processing. Own feature development using Microsoft Azure native services like App Service, Azure Functions, Azure Storage, Service Bus queues, Event Hubs, Event Grid, Application Gateway, Azure SQL, Azure Databricks, etc. Identify opportunities to fine-tune and optimize applications running on Microsoft Azure, reduce cost, adopt cloud best practices, and address data and application security, scalability and high availability. Mentor the team on infrastructure, networking, data migration, monitoring and troubleshooting aspects of Microsoft Azure. Focus on automation using Infrastructure as Code (IaC), Jenkins, Azure DevOps, Terraform, etc. Communicate effectively with other engineers and QA. Establish, refine and integrate development and test environment tools and software as needed. Identify production and non-production application issues. This is a Senior Cloud Data Engineer position requiring about 7+ years of hands-on technical experience in data processing, reporting and cloud technologies, with working knowledge of executing projects in Agile methodologies. 
Required Skills: 1. Be able to envision the overall solution for defined functional and non-functional requirements, and be able to define technologies, patterns and frameworks to materialize it. 2. Design and develop the framework of the system and be able to explain the choices made; also write and review design documents explaining the overall architecture, framework and high-level design of the application. 3. Create, understand and validate the design and estimated effort for a given module/task, and be able to justify it. 4. Be able to define in-scope and out-of-scope items and the assumptions taken while creating effort estimates. 5. Be able to identify and integrate well over all integration points in the context of a project as well as other applications in the environment. 6. Understand the business requirements and develop data models. Technical Skills: 1. Strong proficiency as a Cloud Data Engineer utilizing Power BI and Azure Databricks to support, design, develop and deploy requested updates to new and existing cloud-based services. 2. Experience with developing, implementing, monitoring and troubleshooting applications in the Azure public cloud. 3. Proficiency in data modeling and reporting. 4. Design and implement database schemas. 5. Design and develop well-documented source code. 6. Develop both unit-testing and system-testing scripts to be incorporated into the QA process. 7. Automate all deployment steps with Infrastructure as Code (IaC) and Jenkins Pipeline as Code (JPaC) concepts. 8. Define guidelines and benchmarks for NFR considerations during project implementation. 9. Do required POCs to make sure that the suggested design/technologies meet the requirements. Required Experience: 5 to 10+ years of professional experience developing SQL, Power BI, SSIS and Azure Databricks solutions. 5 to 10+ years of professional experience utilizing SQL Server for data storage in large-scale .NET solutions. Strong technical writing skills. 
Strong knowledge of build/deployment/unit testing tools. Highly motivated team player and a self-starter. Excellent verbal, phone, and written communication skills. Knowledge of Cloud-based architecture and concepts. Required Qualifications: Graduate or Post Graduate in Computer Science /Engineering/Science/Mathematics or related field with around 10 years of experience in executing the Data Reporting solutions Cloud Certification, preferably Azure
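Row-level security, listed among the responsibilities above, conceptually means the same dataset is filtered per viewer before the report renders it; in Power BI this is expressed as DAX role filters. A language-agnostic sketch of the idea in plain Python (the roles and data here are invented for illustration):

```python
# Hypothetical RLS rule set: each role may only see certain regions.
ROLE_FILTERS = {
    "north_manager": {"North"},
    "south_manager": {"South"},
    "executive": {"North", "South", "East", "West"},
}

SALES = [
    {"region": "North", "amount": 120},
    {"region": "South", "amount": 80},
    {"region": "East", "amount": 60},
]

def rows_for(role: str) -> list:
    """Return only the rows the given role is allowed to see."""
    allowed = ROLE_FILTERS.get(role, set())
    return [r for r in SALES if r["region"] in allowed]

print(sum(r["amount"] for r in rows_for("north_manager")))  # 120
print(sum(r["amount"] for r in rows_for("executive")))      # 260
```

The key property, which Power BI enforces server-side, is that an unknown or unmapped role sees nothing rather than everything.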
Posted 1 month ago
9.0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
Total years of experience: 9 years (relevant 8 years is a must). Location: TVM/Kochi - hybrid, 3 days weekly (Chennai and Bangalore candidates must work from the office during the initial months). Notice period: immediate to 30 days. Salary: maximum 45 LPA. Job Purpose: Responsible for consulting with the client to understand their AI/ML and analytics needs and delivering AI/ML applications to the client. Job Description / Duties & Responsibilities ▪ Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems ▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders ▪ Work closely with the data engineering team in identifying relevant data and pre-processing the data for suitable models ▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks ▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible ▪ Work closely with the application team to deliver AI/ML solutions as microservices. Job Specification / Skills and Competencies ▪ Master’s/Bachelor’s in Computer Science, Statistics or Economics ▪ At least 4 years of experience working in the data science field; passionate about numbers and quantitative problems ▪ Deep understanding of machine learning models and algorithms ▪ Experience in analysing complex business problems, translating them into data science problems and modelling data science solutions for the same ▪ Understanding of and experience in one or more of the following machine learning algorithms: Regression, Time Series, Logistic Regression, Naive Bayes, kNN, SVM, Decision Trees, Random Forest, k-Means Clustering, etc. 
▪ NLP, Text Mining, LLMs (GPTs) ▪ Deep learning and reinforcement learning algorithms ▪ Understanding of and experience in one or more of the machine learning frameworks: TensorFlow, Caffe, Torch, etc. ▪ Understanding of and experience in building machine learning models using various packages in one or more of the programming languages Python / R ▪ Knowledge of and experience with SQL, relational databases, NoSQL databases and data warehouse concepts ▪ Understanding of AWS/Azure cloud architecture ▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda) ▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.) ▪ To adhere to the Information Security Management policies and procedures. Soft Skills Required ▪ Must be a good team player with good communication skills ▪ Must have good presentation skills ▪ Must be a proactive problem solver and a self-driven leader ▪ Manage and nurture a team of data scientists ▪ Desire for numbers and patterns
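To give a flavour of the algorithms listed above, here is a from-scratch sketch of k-nearest-neighbours classification on toy 2-D data. This is plain-Python illustration only; in practice one of the mentioned frameworks or scikit-learn would be used.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.
    `train` is a list of ((x, y), label) pairs."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy dataset: two well-separated clusters.
train = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]
print(knn_predict(train, (0.5, 0.5)))  # A
print(knn_predict(train, (5.5, 5.5)))  # B
```

The same predict-by-neighbourhood idea underlies the interview-level distinction between lazy learners like kNN and fitted models like logistic regression or SVM.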
Posted 1 month ago
2.0 - 3.0 years
7 Lacs
Mumbai
On-site
Job Title: Tableau Developer Experience: 2-3 Years Location: Mumbai, India About the Role: We are seeking a highly motivated and skilled Tableau Developer with 2-3 years of proven experience to join our dynamic team in Mumbai. In this role, you will be instrumental in transforming complex data into insightful and interactive dashboards and reports using Tableau. You will work closely with business stakeholders, data analysts, and other technical teams to understand reporting requirements, develop effective data visualizations, and contribute to data-driven decision-making within the organization. Roles and Responsibilities: Dashboard Development: Design, develop, and maintain compelling and interactive Tableau dashboards and reports that meet business requirements and enhance user experience. Create various types of visualizations, including charts, graphs, maps, and tables, to effectively communicate data insights. Implement advanced Tableau features such as calculated fields, parameters, sets, groups, and Level of Detail (LOD) expressions to create sophisticated analytics. Optimize Tableau dashboards for performance and scalability, ensuring quick loading times and efficient data retrieval. Data Sourcing and Preparation: Connect to various data sources (e.g., SQL Server, Oracle, Excel, cloud-based data platforms like AWS Redshift, Google BigQuery, etc.) and extract, transform, and load (ETL) data for reporting purposes. Perform data analysis, validation, and cleansing to ensure the accuracy, completeness, and consistency of data used in reports. Collaborate with data engineers and data analysts to understand data structures, identify data gaps, and ensure data quality. Requirements Gathering & Collaboration: Work closely with business users, stakeholders, and cross-functional teams to gather and understand reporting and analytical requirements. Translate business needs into technical specifications and develop effective visualization solutions. 
Participate in discussions and workshops to refine requirements and propose innovative reporting approaches. Troubleshooting and Support: Diagnose and resolve issues related to data accuracy, dashboard performance, and report functionality. Provide ongoing support and maintenance for existing Tableau dashboards and reports. Assist end-users with Tableau-related queries and provide training as needed. Documentation and Best Practices: Create and maintain comprehensive documentation for Tableau dashboards, data sources, and development processes. Adhere to data visualization best practices and design principles to ensure consistency and usability across all reports. Contribute to code reviews and knowledge sharing within the team. Continuous Improvement: Stay up-to-date with the latest Tableau features, updates, and industry trends in data visualization and business intelligence. Proactively identify opportunities for improvement in existing reports and propose enhancements. Participate in an Agile development environment, adapting to changing priorities and contributing to sprint goals. Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. 2 years of hands-on experience as a Tableau Developer, with a strong portfolio of developed dashboards and reports. Proficiency in Tableau Desktop and Tableau Server (including publishing, managing permissions, and performance monitoring). Strong SQL skills for data extraction, manipulation, and querying from various databases. Solid understanding of data warehousing concepts, relational databases, and ETL processes. Familiarity with data visualization best practices and design principles. Excellent analytical and problem-solving skills with a keen eye for detail. Strong communication skills (verbal and written) with the ability to explain complex data insights to non-technical stakeholders. 
Ability to work independently and collaboratively in a team-oriented environment. Adaptability to changing business requirements and a fast-paced environment. Additional Qualifications: Experience with other BI tools (e.g., Power BI, Qlik Sense) is a plus. Familiarity with scripting languages like Python or R for advanced data manipulation and analytics. Knowledge of cloud data platforms (e.g., AWS, Azure, GCP). Experience with Tableau Prep for data preparation. Job Types: Full-time, Permanent Pay: Up to ₹750,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Monday to Friday Work Location: In person
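The Level of Detail expressions mentioned above attach an aggregate computed at a fixed granularity to every row; for instance, {FIXED [Region] : SUM([Sales])} gives each row its region's total regardless of the view's own grouping. The same idea in plain Python (column names are illustrative):

```python
from collections import defaultdict

rows = [
    {"region": "West", "sales": 100},
    {"region": "West", "sales": 50},
    {"region": "East", "sales": 70},
]

# Step 1: aggregate at the fixed level, i.e. SUM(sales) per region.
region_total = defaultdict(int)
for r in rows:
    region_total[r["region"]] += r["sales"]

# Step 2: attach the fixed-level aggregate back onto every row,
# mirroring what {FIXED [Region] : SUM([Sales])} yields in Tableau.
for r in rows:
    r["region_sales"] = region_total[r["region"]]

print(rows[0]["region_sales"])  # 150
print(rows[2]["region_sales"])  # 70
```

Having the aggregate on each row is what lets a Tableau calculation like [sales] / [region_sales] express "share of region" without changing the view's level of detail.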
Posted 1 month ago
0 years
0 Lacs
Chennai
On-site
Job Description Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, carry out an elicitation of processes, data and capabilities and derive the target processes and the business requirements for the current and future solution. Job Description - Grade Specific Performs analysis of processes, systems, data and business information and research, and builds up domain knowledge. Skills (competencies) Abstract Thinking Active Listening Agile (Software Development Framework) Analytical Thinking Backlog Grooming Business Architecture Modeling Business Process Modeling (e.g. BPMN) Change Management Coaching Collaboration Commercial Acumen Conceptual Data Modeling Conflict Management Confluence Critical Thinking CxO Conversations Data Analysis Data Requirements Management Decision-Making Emotional Intelligence Enterprise Architecture Modelling Facilitation Functional IT Architecture Modelling Giving Feedback Google Cloud Platform (GCP) (Cloud Platform) Influencing Innovation Jira Mediation Mentoring Microsoft Office Motivation Negotiation Networking Power BI Presentation skills Prioritization Problem Solving Project Governance Project Management Project Planning Qlik Relationship-Building Requirements Gathering Risk Management Scope Management SQL Stakeholder Management Story Mapping Storytelling Strategic Management Strategic Thinking SWOT Analysis Systems Requirement Analysis (or Management) Tableau Trusted Advisor UI-Design / Wireframing UML User Journey User Research Verbal Communication Written Communication
Posted 1 month ago
5.0 years
5 - 10 Lacs
Bengaluru
On-site
Job requisition ID :: 84163 Date: Jun 23, 2025 Location: Bengaluru Designation: Senior Consultant Entity: We are seeking a Senior Data Engineer with extensive experience in cloud platforms and data engineering tools, with a strong emphasis on Databricks. The ideal candidate will have deep expertise in designing and optimizing data pipelines, building scalable ETL workflows, and leveraging Databricks for advanced analytics and data processing. Experience with Google Cloud Platform is beneficial, particularly in integrating Databricks with cloud storage solutions and data warehouses such as BigQuery. The candidate should have a proven track record of working on data enablement projects across various data domains and be well-versed in the Data as a Product approach, ensuring data solutions are scalable, reusable, and aligned with business needs. Key Responsibilities: Design, develop, and optimize scalable data pipelines using Databricks, ensuring efficient data ingestion, transformation, and processing. Implement and manage data storage solutions, including Delta Tables for structured storage and seamless data versioning. 5+ years of experience with cloud data services, with a strong focus on Databricks and its integration with Google Cloud Platform storage and analytics tools such as BigQuery. Leverage Databricks for advanced data processing, including the development and optimization of data workflows, Delta Live Tables, and ML-based data transformations. Monitor and optimize Databricks performance, focusing on cluster configurations, resource utilization, and Delta Table performance tuning. Collaborate with cross-functional teams to drive data enablement projects, ensuring scalable, reusable, and efficient solutions using Databricks. Apply the Data as a Product / Data as an Asset approach, ensuring high data quality, accessibility, and usability within Databricks environments. 
5+ years of experience with analytical software and languages, including Spark (Databricks Runtime), Python, and SQL for data engineering and analytics. Should have strong expertise in Data Structures and Algorithms (DSA) and problem-solving, enabling efficient design and optimization of data workflows. Experienced in CI/CD pipelines using GitHub for automated data pipeline deployments within Databricks. Experienced in Agile/Scrum environments, contributing to iterative development processes and collaboration within data engineering teams. Experience in Data Streaming is a plus, particularly leveraging Kafka or Spark Structured Streaming within Databricks. Familiarity with other ETL/ELT tools is a plus, such as Qlik Replicate, SAP Data Services, or Informatica, with a focus on integrating these with Databricks. Qualifications: A Bachelor’s or Master’s degree in Computer Science, Engineering, or a related discipline. Over 5 years of hands-on experience in data engineering or a closely related field. Proven expertise in AWS and Databricks platforms. Advanced skills in data modeling and designing optimized data structures. Knowledge of Azure DevOps and proficiency in Scrum methodologies. Exceptional problem-solving abilities paired with a keen eye for detail. Strong interpersonal and communication skills for seamless collaboration. A minimum of one certification in AWS or Databricks, such as Cloud Engineering, Data Services, Cloud Practitioner, Certified Data Engineer, or an equivalent from reputable MOOCs.
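Delta Tables, central to the role above, are typically maintained with MERGE-style upserts: incoming rows update the target row with the same key or are inserted as new. A minimal sketch of that merge semantics in plain Python (schema and keys are hypothetical; Databricks expresses this as a Delta MERGE statement):

```python
def merge_upsert(target: dict, updates: list, key: str) -> dict:
    """Toy MERGE: for each incoming row, update the target row
    with the matching key, or insert it if the key is new."""
    for row in updates:
        existing = target.get(row[key], {})
        target[row[key]] = {**existing, **row}  # incoming values win on conflict
    return target

# Target table keyed by id; one incoming update and one insert.
table = {1: {"id": 1, "qty": 5}}
merge_upsert(table, [{"id": 1, "qty": 7}, {"id": 2, "qty": 3}], key="id")
print(sorted(table))    # [1, 2]
print(table[1]["qty"])  # 7
```

Delta adds the pieces this toy omits: ACID transaction guarantees, schema enforcement, and time travel across table versions.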
Posted 1 month ago
6.0 years
0 Lacs
Kochi, Kerala, India
On-site
Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems ▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders ▪ Work closely with the data engineering team in identifying relevant data and pre-processing the data for suitable models ▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks ▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible ▪ Work closely with the application team to deliver AI/ML solutions as modular offerings. Skills/Specification ▪ Master’s/Bachelor’s in Computer Science, Statistics or Economics ▪ At least 6 years of experience working in the data science field; passionate about numbers and quantitative problems ▪ Deep understanding of machine learning models and algorithms ▪ Experience in analysing complex business problems, translating them into data science problems and modelling data science solutions for the same ▪ Understanding of and experience in one or more of the following machine learning algorithms: Regression, Time Series, Logistic Regression, Naive Bayes, kNN, SVM, Decision Trees, Random Forest, k-Means Clustering, etc. NLP, Text Mining, LLMs (GPTs): OpenAI, Azure OpenAI, AWS Bedrock, Gemini, Llama, DeepSeek, etc. (knowledge of fine-tuning / custom-training GPTs would be an added advantage). Deep learning and reinforcement learning algorithms ▪ Understanding of and experience in one or more of the machine learning frameworks: TensorFlow, Caffe, Torch, etc. 
▪ Understanding of and experience in building machine learning models using various packages in Python ▪ Knowledge of and experience with SQL, relational databases, NoSQL databases and data warehouse concepts ▪ Understanding of AWS/Azure cloud architecture ▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda) ▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.) ▪ To adhere to the Information Security Management policies and procedures. Soft Skills Required ▪ Must be a good team player with good communication skills ▪ Must have good presentation skills ▪ Must be a proactive problem solver and a self-driven leader ▪ Manage and nurture a team of data scientists ▪ Desire for numbers and patterns
Posted 1 month ago
9.0 years
0 Lacs
Kerala, India
Remote
Position: AI Architect (PERMANENT only). Experience: 9+ years (relevant 8 years is a must). Budget: up to ₹40-45 LPA. Notice period: immediate to 45 days. Key skills: Python, Data Science (AI/ML), SQL. Location: TVM/Kochi/remote. Job Purpose: Responsible for consulting with the client to understand their AI/ML and analytics needs and delivering AI/ML applications to the client. Job Description / Duties & Responsibilities ▪ Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems ▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders ▪ Work closely with the data engineering team in identifying relevant data and pre-processing the data for suitable models ▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks ▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible ▪ Work closely with the application team to deliver AI/ML solutions as microservices. Job Specification / Skills and Competencies ▪ Master’s/Bachelor’s in Computer Science, Statistics or Economics ▪ At least 6 years of experience working in the data science field; passionate about numbers and quantitative problems ▪ Deep understanding of machine learning models and algorithms ▪ Experience in analysing complex business problems, translating them into data science problems and modelling data science solutions for the same ▪ Understanding of and experience in one or more of the following machine learning algorithms: Regression, Time Series, Logistic Regression, Naive Bayes, kNN, SVM, Decision Trees, Random Forest, k-Means Clustering, etc. 
▪ NLP, Text Mining, LLMs (GPTs)
▪ Deep Learning, Reinforcement Learning algorithms
▪ Understanding of and experience in one or more machine learning frameworks: TensorFlow, Caffe, Torch, etc.
▪ Understanding of and experience in building machine learning models using various packages in one or more programming languages: Python / R
▪ Knowledge of and experience with SQL, relational databases, NoSQL databases, and data warehouse concepts
▪ Understanding of AWS/Azure cloud architecture
▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda)
▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.)
▪ Adherence to the Information Security Management policies and procedures
Soft Skills Required
▪ Must be a good team player with good communication skills
▪ Must have good presentation skills
▪ Must be a proactive problem solver and a self-driven leader
▪ Manage and nurture a team of data scientists
▪ Passion for numbers and patterns
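The algorithm list above includes kNN; as a minimal, stdlib-only sketch of the idea, here is k-nearest-neighbour classification in plain Python. The toy data and the choice of k are illustrative, not taken from the posting.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (features, label) pairs; distance is Euclidean.
    """
    # Sort all training points by distance to the query point.
    dists = sorted((math.dist(x, query), label) for x, label in train)
    # Majority vote among the k closest labels.
    top_k = [label for _, label in dists[:k]]
    return Counter(top_k).most_common(1)[0][0]

# Illustrative toy data: two clusters on a 2-D plane.
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, (0.05, 0.1)))  # → a
```

In practice a library implementation (e.g. scikit-learn's `KNeighborsClassifier`) would be used; this sketch only shows the mechanics.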
Posted 1 month ago
7.0 years
15 - 25 Lacs
Pune, Maharashtra, India
On-site
At Improzo ( Improve + Zoe; meaning Life in Greek ), we believe in improving life by empowering our customers. Founded by seasoned Industry leaders, we are laser focused on delivering quality-led commercial analytical solutions to our clients. Our dedicated team of experts in commercial data, technology, and operations has been evolving and learning together since our inception. Here, you won't find yourself confined to a cubicle; instead, you'll be navigating open waters, collaborating with brilliant minds to shape the future. You will work with leading Life Sciences clients, seasoned leaders and carefully chosen peers like you! People are at the heart of our success, so we have defined our CARE values framework with a lot of effort, and we use it as our guiding light in everything we do. We CARE! Customer-Centric: Client success is our success. Prioritize customer needs and outcomes in every action. Adaptive: Agile and Innovative, with a growth mindset. Pursue bold and disruptive avenues that push the boundaries of possibilities. Respect: Deep respect for our clients & colleagues. Foster a culture of collaboration and act with honesty, transparency, and ethical responsibility. Execution: Laser focused on quality-led execution; we deliver! Strive for the highest quality in our services, solutions, and customer experiences. About The Role Introduction: We are seeking an experienced and highly skilled Snowflake Data Lead/Architect to lead strategic projects focused on Pharma Commercial Data Management. This role demands a professional with 7-9 years of experience in data architecture, data management, ETL, data transformation, and governance, with an emphasis on providing scalable and secure data solutions for the pharmaceutical sector. The ideal candidate will bring a deep understanding of data architecture principles, experience with cloud platforms like Snowflake and Databricks, and a solid background in driving commercial data management projects. 
If you're passionate about leading impactful data initiatives, optimizing data workflows, and supporting the pharmaceutical industry's data needs, we invite you to apply.
Key Responsibilities
Snowflake Solution Design & Development:
Work closely with client stakeholders, data architects, and business analysts to understand detailed commercial data requirements and translate them into efficient Snowflake technical designs. Design, develop, and optimize complex ETL/ELT processes within Snowflake using SQL, Stored Procedures, UDFs, Streams, Tasks, and other Snowflake features. Implement data models (dimensional, star, and snowflake schemas) optimized for commercial reporting, analytics, and data science use cases. Implement data governance, security, and access controls within Snowflake, adhering to strict pharmaceutical compliance regulations (e.g., HIPAA, GDPR, GxP principles). Develop and manage data sharing and collaboration solutions within Snowflake for internal and external partners. Optimize Snowflake warehouse sizing, query performance, and overall cost efficiency.
Data Integration:
Integrate data from various commercial sources, including CRM systems (e.g., Veeva, Salesforce), sales data (e.g., IQVIA, Symphony), marketing platforms, patient services data, RWD, and other relevant datasets into Snowflake. Utilize tools like Fivetran, Azure Data Factory, or custom Python scripts for data ingestion and transformation.
Tech Leadership & Expertise:
Provide technical expertise and support for Snowflake-related issues, troubleshooting data discrepancies and performance bottlenecks. Participate in code reviews, ensuring adherence to best practices and coding standards. Mentor junior developers and contribute to the growth of the data engineering team.
Data Quality, Governance & Security:
Implement robust data quality checks, validation rules, and reconciliation processes to ensure accuracy and reliability of commercial data.
Apply and enforce data governance policies, including data lineage, metadata management, and master data management principles. Implement and maintain strict data security, access controls, and data masking techniques within Snowflake, adhering to pharmaceutical industry compliance standards (e.g., HIPAA, GDPR, GxP principles). Required Qualifications Bachelor's degree in Computer Science, Information Systems, Engineering, or a related quantitative field. Master's degree preferred. 7+ years of progressive experience in data warehousing, ETL/ELT development, and data engineering roles. 4+ years of hands-on, in-depth experience as a Snowflake Developer, with a proven track record of designing and implementing complex data solutions on the Snowflake platform. Expert-level proficiency in SQL for data manipulation, complex query optimization, and advanced stored procedure development within Snowflake. Strong understanding and practical experience with data modeling techniques (e.g., Dimensional Modeling, Data Vault). Experience with data integration tools for Snowflake (e.g., Fivetran, Matillion, DBT, Airflow, or custom Python-based ETL frameworks). Proficiency in at least one scripting language (e.g., Python) for data processing, API integration, and automation. Demonstrable understanding of data governance, data security, and regulatory compliance within the pharmaceutical or other highly regulated industries (e.g., GxP, HIPAA, GDPR, PII). Experience working in a client-facing or consulting environment with strong communication and presentation skills. Excellent problem-solving, analytical, and communication skills. Ability to work independently and as part of a collaborative team in a fast-paced environment. Preferred Qualifications Specific experience with pharmaceutical commercial data sets such as sales data (e.g., IQVIA, Symphony), CRM data (e.g., Veeva, Salesforce), claims data, patient services data, or master data management (MDM) for commercial entities. 
Knowledge of commercial analytics concepts and KPIs in the pharma industry (e.g., sales performance, market share, patient adherence). Experience working with cloud platforms (AWS, Azure, or GCP) and their native services for data storage and processing. Experience with version control systems (e.g., Git). Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced). Experience with data visualization tools (e.g., Tableau, Power BI, Qlik Sense) and their connectivity to Snowflake. Knowledge of Agile methodologies for managing data projects. Benefits Competitive salary and benefits package. Opportunity to work on cutting-edge tech projects transforming the life sciences industry. Collaborative and supportive work environment. Opportunities for professional development and growth. Skills: data visualization tools, data vault, Azure Data Factory, data architecture, client-facing, data governance, data quality, data, Snowflake, SQL, data integration, Fivetran, pharma commercial, data security, Python, dimensional modeling, ETL, data management
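The reconciliation processes this role calls for can be as simple as comparing row counts and a summed measure between a source extract and its Snowflake target. A hypothetical stdlib-only sketch, with invented column name `rx_count` and a made-up tolerance:

```python
def reconcile(source_rows, target_rows, amount_key, tolerance=0.01):
    """Compare row counts and a summed amount column between two extracts.

    Returns a dict of discrepancies; an empty dict means the loads reconcile.
    """
    issues = {}
    if len(source_rows) != len(target_rows):
        issues["row_count"] = (len(source_rows), len(target_rows))
    src_sum = sum(r[amount_key] for r in source_rows)
    tgt_sum = sum(r[amount_key] for r in target_rows)
    if abs(src_sum - tgt_sum) > tolerance:
        issues["amount_total"] = (src_sum, tgt_sum)
    return issues

# Illustrative extracts; `rx_count` is a hypothetical measure column.
source = [{"rx_count": 10.0}, {"rx_count": 5.5}]
target = [{"rx_count": 10.0}, {"rx_count": 5.5}]
print(reconcile(source, target, "rx_count"))  # → {} (extracts match)
```

Real pipelines would run checks like this inside the warehouse (or via a framework such as dbt tests); the sketch only shows the shape of a count-and-sum reconciliation.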
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Test Engineer Location: Hyderabad (Onsite) Experience Required: 5 Years Job Description: We are looking for a detail-oriented and skilled Test Engineer with 5 years of experience in testing SAS applications and data pipelines. The ideal candidate should have a solid background in SAS programming, data validation, and test automation within enterprise data environments. Key Responsibilities: Conduct end-to-end testing of SAS applications and data pipelines to ensure accuracy and performance. Write and execute test cases/scripts using Base SAS, Macros, and SQL. Perform SQL query validation and data reconciliation using industry-standard practices. Validate ETL pipelines developed using tools like Talend, IBM Data Replicator, and Qlik Replicate. Conduct data integration testing with Snowflake and use explicit pass-through SQL to ensure integrity across platforms. Utilize test automation frameworks using Selenium, Python, or Shell scripting to increase test coverage and reduce manual effort. Identify, document, and track bugs through resolution, ensuring high-quality deliverables. Required Skills: Strong experience in SAS programming (Base SAS, Macro). Expertise in writing and validating SQL queries. Working knowledge of data testing frameworks and reconciliation tools. Experience with Snowflake and ETL validation tools like Talend, IBM Data Replicator, Qlik Replicate. Proficiency in test automation using Selenium, Python, or Shell scripts. Solid understanding of data pipelines and data integration testing practices.
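SQL query validation of the kind described above can be prototyped against an in-memory database before running against the real platform. A sketch using Python's stdlib sqlite3; the table names (`source_claims`, `target_claims`) and data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_claims (id INTEGER, amount REAL);
    CREATE TABLE target_claims (id INTEGER, amount REAL);
    INSERT INTO source_claims VALUES (1, 100.0), (2, 250.5);
    INSERT INTO target_claims VALUES (1, 100.0), (2, 250.5);
""")

# Reconciliation query: rows present in source but missing or changed in target.
mismatches = conn.execute("""
    SELECT s.id FROM source_claims s
    LEFT JOIN target_claims t ON s.id = t.id AND s.amount = t.amount
    WHERE t.id IS NULL
""").fetchall()

assert mismatches == [], f"reconciliation failed for ids: {mismatches}"
print("source and target reconcile")
```

The same anti-join pattern carries over to pass-through SQL against Snowflake, where the two tables would be the ETL source staging table and the loaded target.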
Posted 1 month ago
15.0 - 24.0 years
40 - 60 Lacs
Hyderabad, Chennai
Work from Office
Job Title: Technical Program Manager, Data Engineering & Analytics Experience: 16-25 years (relevant) Salary: Based on current CTC Location: Chennai and Hyderabad Notice Period: Immediate joiners only. Critical Expectations: 1) Candidate should have managed a team of at least 100 people. 2) Should have a minimum of 8 years' experience in Data and AI development. 3) Should have experience with complex cloud data migrations. Position Overview: We are seeking an experienced Program Manager to lead large-scale, complex Data, BI, and AI/ML initiatives. The ideal candidate will have a deep technical understanding of modern data architectures, hands-on expertise in end-to-end solution delivery, and a proven ability to manage client relationships and multi-functional teams. This role will involve driving innovation, operational excellence, and strategic growth within Data Engineering & Analytics programs. Job Description: Responsible for managing large and complex programs encompassing multiple Data, BI, and AI/ML solutions. Lead the design, development, and implementation of Data Engineering & Analytics solutions involving Teradata, Google Cloud Platform (GCP), AI/ML, Qlik, Tableau, etc.
Work closely with clients to understand their needs and translate them into technology solutions. Provide technical leadership to solve complex business issues that translate into data analytics solutions. Prepare operational/strategic reports on defined cadences and present them to steering and operational committees via WSR, MSR, etc. Responsible for ensuring compliance with defined service level agreement (SLA) and key performance indicator (KPI) metrics. Track and monitor the performance of services, identify areas for improvement, and implement changes as needed. Continuously evaluate and improve processes to ensure that services are delivered efficiently and effectively. Proactively identify issues and risks, and prepare appropriate mitigation/resolution plans. Foster a positive work environment and build a culture of automation and innovation to improve service delivery performance. Develop the team as a coach and mentor; support and manage team members. Create SOWs, proposals, solutions, and estimations for Data Analytics solutions. Contribute to building the Data Analytics and AI/ML practice by creating case studies, POCs, etc. Shape opportunities and create execution approaches throughout the lifecycle of client engagements. Collaborate with various functions/teams in the organization to support recruitment, hiring, onboarding, and other operational activities. Maintain positive relationships with all stakeholders and ensure proactive response to opportunities and challenges. Must Have Skills: Deep hands-on expertise in end-to-end solution lifecycle management in Data Engineering and Data Management.
Strong technical understanding of modern data architectures and solutions. Ability to execute implementation strategy through a roadmap and collaboration with different stakeholders. Understanding of cloud data architecture and data modeling concepts and principles, including cloud data lakes, warehouses and marts, dimensional modeling, star schemas, and real-time and batch ETL/ELT. Experience driving AI/ML and GenAI projects would be good to have. Experience with cloud-based data analytics platforms such as GCP, Snowflake, Azure, etc. Good understanding of SDLC and Agile methodologies. A Telecom background would be good to have. Must have handled a team size of 50+. Qualification: 15-20 years' experience primarily working on Data Warehousing, BI & Analytics, and Data Management projects in tech architect, delivery, client relationship, and practice roles, involving ETL, reporting, big data, and analytics. Experience architecting, designing, and developing Data Engineering, Business Intelligence, and reporting projects. Experience working with data management solutions like Data Quality, Metadata, Master Data, and Governance. Strong experience in cloud data migration programs. Focused on value-, innovation-, and automation-led account mining. Strong interpersonal, stakeholder management, and team-building skills.
Posted 1 month ago
7.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Sr. Software Engineer - Microsoft Power BI Job Date: May 25, 2025 Job Requisition Id: 61407 Location: Bangalore, KA, IN; Hyderabad, TG, IN; Pune, IN YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Microsoft Power BI professionals in the following areas. We are looking forward to hiring a Power BI Consultant who thrives on challenges and desires to make a real difference in the business world, in an environment of extraordinary innovation and unprecedented growth. The position is an exciting opportunity for a self-starter who enjoys working in a fast-paced, quality-oriented team environment. Key Responsibilities: Design and develop dashboards in Power BI. Ensure successful data loads, report availability, and technical support. Migrate dashboards to Power BI. Perform data analysis using advanced analytics tools. Demonstrate excellent knowledge of the data science field and programming. Act as an individual contributor for end-to-end report/dashboard development and data mining. Create and maintain technical and functional documentation. Qualifications: Power BI expert with 7+ years of experience. Excellent knowledge of BI tools like Qlik, Power BI, etc. Strong hands-on experience in designing Power BI dashboards. Strong knowledge of DAX and SQL programming. Strong knowledge of Python and data science tools. Excellent spoken and written communication. A minimum of a bachelor's degree in IT/Data Sciences or related fields. At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.
Posted 1 month ago
12.0 years
0 Lacs
Greater Hyderabad Area
On-site
Technical Architect - AWS/SnowFlake Job Date: May 24, 2025 Job Requisition Id: 57720 Location: Hyderabad, IN; Bangalore, KA, IN YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Snowflake professionals in the following areas. Experience: 12+ years. Job Description: Strong communication and proactive skills; ability to lead conversations. Experience architecting and delivering solutions on AWS. Hands-on experience with cloud warehouses like Snowflake. Strong knowledge of data integration, data modelling (Dimensional & Data Vault), and visualization practices. Good knowledge of data management (Data Quality, Data Governance, etc.). Zeal to pick up new technologies, do PoCs, and present PoVs. Technical (strong experience in at least one item in each category): Cloud: AWS. Data Integration: Qlik Replicate, SnapLogic, Matillion & Informatica. Visualization: Power BI & ThoughtSpot. Storage & DBs: Snowflake, AWS. Good to have: certification in Snowflake, SnapLogic. At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What You'll Do: SQL Development & Optimization: Write complex and optimized SQL queries, including advanced joins, subqueries, analytical functions, and stored procedures, to extract, manipulate, and analyze large datasets. Data Pipeline Management: Design, build, and support robust data pipelines to ensure timely and accurate data flow from various sources into our analytical platforms. Statistical Data Analysis: Apply a strong foundation in statistical data analysis to uncover trends, patterns, and insights from data, contributing to data-driven decision-making. Data Visualization: Work with various visualization tools (e.g., Google PLX, Tableau, Data Studio, Qlik Sense, Grafana, Splunk) to create compelling dashboards and reports that clearly communicate insights. Web Development Contribution: Leverage your experience in web development (HTML, CSS, jQuery, Bootstrap) to support data presentation layers or internal tools. Machine Learning Collaboration: Utilize your familiarity with ML tools and libraries (Scikit-learn, Pandas, NumPy, Matplotlib, NLTK) to assist in data preparation and validation for machine learning initiatives. Agile Collaboration: Work effectively within an Agile development environment, contributing to sprints and adapting to evolving requirements. Troubleshooting & Problem-Solving: Apply strong analytical and troubleshooting skills to identify and resolve data-related issues. Skills Required: Expert in SQL (joins, subqueries, analytic functions, stored procedures). Experience building and supporting data pipelines. Strong foundation in statistical data analysis. Knowledge of visualization tools: Google PLX, Tableau, Data Studio, Qlik Sense, Grafana, Splunk, etc. Experience in web development: HTML, CSS, jQuery, Bootstrap. Familiarity with ML tools: Scikit-learn, Pandas, NumPy, Matplotlib, NLTK, and more. Hands-on with Agile environments. Strong analytical and troubleshooting skills. Bachelor's in CS, Math, Stats, or equivalent. (ref:hirist.tech)
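As an illustration of the analytical (window) functions this role calls for, here is a ranking query runnable against Python's bundled sqlite3 (SQLite 3.25+ supports window functions). The `sales` schema and data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('north', 'asha', 300), ('north', 'ben', 500),
        ('south', 'chen', 200), ('south', 'dia', 400);
""")

# RANK() partitioned by region: find the top rep per region without a self-join.
rows = conn.execute("""
    SELECT region, rep, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()

top_reps = [(region, rep) for region, rep, _, rnk in rows if rnk == 1]
print(top_reps)  # → [('north', 'ben'), ('south', 'dia')]
```

The same `RANK() OVER (PARTITION BY ...)` pattern applies unchanged in most enterprise SQL dialects, which is what makes window functions a standard interview topic for roles like this one.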
Posted 1 month ago