3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsibilities:
- Provide data analytics, risk management and IT audit support during business development pursuits (e.g. proposals, cost build-ups, sales meetings)
- Identify, prioritize and execute on high-value opportunities to improve data risk services methodologies, including developing and delivering training, whitepapers, and desktop procedures for best-practice evaluation methods by business application (prioritizing Oracle Fusion, SAP ECC and SAP S/4HANA, Microsoft D365, Workday, NetSuite and other tier 1 business applications)
- Identify and prioritize high-value opportunities to improve audit and compliance processes through analytics and automation, particularly in areas unique to Data GRC (e.g. metadata management, master data management, data lineage capture and mapping, risk and controls design and testing, upstream and downstream data quality and accuracy validations)
- Develop and implement data analytics solutions, including creating dashboards and reports; this role requires the technical expertise to directly build and manage analytics, actively engaging in data analysis, building visualizations, and providing actionable insights to support decision-making
- Upskill and train junior staff on best practices and approaches to data and risk management, including risk management and internal audit basics, analytics and automation
- Execute and review all work-papers and deliverables, including reporting to client stakeholders
- Provide guidance to other internal and external stakeholders (clients, industry events, market events, etc.) on related data risk and analytics best practices
- Facilitate sessions with internal and external personnel to design methodology that (a) helps audit/compliance professionals learn more about the business in order to better focus attention on the areas of highest risk, and (b) identifies issues and potential process exceptions
- Manage communication with IT and/or business resources to locate internal and external data for analysis, understand data, and make data requests or direct connections to databases
- Champion sustainable data risk, analytics and automation design concepts
- Manage the development of visualizations, dashboards and scripts, using agile development methodology
- Perform quality assurance over developer practices for data mapping, data transformation, data joining/blending, data quality, data cleansing, and other data movement activities
- Provide guidance to both internal and external stakeholders on interpreting analytic results
- Coordinate data risk services with off-shore resources at the RSM Delivery Centers in India and El Salvador
- Be an active participant in local employee network groups and build relationships with RSM members across all lines of business and consulting, representing the practice's services and capabilities

Position Requirements:
- Experience working with a team to provide services to numerous clients simultaneously
- Project and program management expertise and strong written and verbal communication skills
- Detail-oriented with a proactive, inquisitive and creative approach to work; preferably analytics- and technology-inclined
- Experience as an auditor or supporting internal or external audit teams, with a fundamental understanding of enterprise risk management and compliance and/or best-practice frameworks such as COSO, Sarbanes-Oxley (SOX), COBIT, etc.
- Understanding of basic accounting, operations and auditing concepts and reporting skills, including documentation requirements
- Understanding of, and ability to describe, the flow of typical business processes, covering at a minimum the purchase-to-pay, order-to-cash, and record-to-report cycles
- Understanding of automation capabilities, such as robotic process automation, machine learning, natural language processing, application programming interfaces, process mining, etc.

Minimum Qualifications:
- Undergraduate degree in Accounting, Management Information Systems, Computer Science, or equivalent level of education
- Minimum of 3 years in IT audit and/or compliance, with expertise in key report testing and experience in testing IT application controls, business process controls, and IT general controls
- Minimum of 3 years' experience in technical analytics using analytics and cleansing tools such as Alteryx
- Minimum of 3 years in public accounting in an audit or risk advisory services capacity
- CPA, CISA, CIA or other related certification

Preferred Qualifications:
- Experience with data analytics of large ERP applications such as MS D365, SAP, Oracle, NetSuite and Workday
- Hands-on experience using audit-focused GRC technologies such as AuditBoard, ServiceNow, TeamMate, IDEA, and Wdesk
- Experience using other industry-standard data analysis technologies such as Alteryx, SAS, SQL, and/or Python
- Experience developing and/or managing dashboard solutions created using Power BI, Tableau, Qlik, or similar technologies
- Experience with process mining using tools like Celonis or ABBYY Timeline
- Experience working with automation software such as Power Automate, Automation Anywhere and UiPath
- Experience working with data from cloud-based applications like Workday, NetSuite, Salesforce and Concur is a plus
- Business development experience is a plus
- Certifications in one or more data analysis technologies such as Alteryx, UiPath, Tableau, or Power BI

Standards of Performance:
- Data stewardship: maintain the confidentiality, integrity and availability of information within your custody
- A self-starter with a process-improvement mentality who is hands-on, results-oriented, and leads by example
- A strong entrepreneurial spirit with the highest levels of professional and personal honesty, integrity and ethics
- Excellent organizational skills and the ability to prioritize multiple tasks, projects and assignments
- Ability to interact with all levels of client staff, including executives and senior managers
- Strong business ethics and willingness to adhere to stringent professional standards
- Willingness to put forth additional effort to meet deadlines when necessary
- Ability to travel to the local office at least 3 days per week
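As a minimal illustration of the upstream/downstream data quality and accuracy validations mentioned in this listing, the core checks (completeness, uniqueness, referential integrity) can be sketched in plain Python. The dataset, field names, and reference keys below are invented for the example:

```python
# Illustrative data-quality checks of the kind used in audit analytics:
# completeness, uniqueness, and referential integrity over extracted rows.
# All record and field names are hypothetical.

def validate(records, key, required, reference_keys):
    """Return a dict of data-quality findings for a list of row dicts."""
    findings = {"missing_required": [], "duplicate_keys": [], "orphan_keys": []}
    seen = set()
    for row in records:
        # Completeness: every required field must be populated.
        for field in required:
            if row.get(field) in (None, ""):
                findings["missing_required"].append((row[key], field))
        # Uniqueness: the business key must not repeat.
        if row[key] in seen:
            findings["duplicate_keys"].append(row[key])
        seen.add(row[key])
        # Referential integrity: the foreign key must exist upstream.
        if row.get("vendor_id") not in reference_keys:
            findings["orphan_keys"].append(row[key])
    return findings

invoices = [
    {"invoice_id": "INV-1", "vendor_id": "V1", "amount": 100.0},
    {"invoice_id": "INV-2", "vendor_id": "V9", "amount": None},
    {"invoice_id": "INV-2", "vendor_id": "V1", "amount": 50.0},
]
result = validate(invoices, key="invoice_id",
                  required=["amount"], reference_keys={"V1", "V2"})
```

In practice, tools like Alteryx or SQL would express the same three tests over full extracts; the sketch only shows the logic being tested.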
Posted 1 month ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
We have an immediate opportunity for a Project Manager with our client. Interested candidates, send your CV to prave.p@lancesoft.com

Position: Project Manager
Duration: 6 months
Location: India (Remote)

Operating Environment, Framework and Boundaries, Working Relationships:
- Lead and influence others, including those more senior, on best practices related to data engineering, including SQL and BI reporting tools such as Qlik and Power BI.
- Experience in the core banking domain is preferred.

Knowledge, Skills and Experience:
- At least 15 years of experience in project management and at least 2 years' experience as a Scrum Master on an Agile project.
- Must have completed at least 20 Agile project cycles.
- Must have used Agile platforms such as JIRA, Azure DevOps, etc.
- SQL and database working experience is preferred.
- Certified Scrum Master or an equivalent qualification.
- Good knowledge of commonly used Agile tools and practices.
- Team player with supervisory, coaching, planning, and management skills.
- Strong written and verbal communication skills.
- Strong interpersonal skills.

Interested candidates, send your CV along with the details below:
Expected salary:
Visa/Work Permit:
Notice Period:
Current Location:
Posted 1 month ago
9.0 years
0 Lacs
Ernakulam, Kerala, India
On-site
Job Description

Position: AI Architect (permanent only)
Experience: 9+ years (8 years relevant is a must)
Notice Period: Immediate to 45 days
Key Skills: Python, Data Science (AI/ML), SQL
Location: TVM/Kochi (Hybrid)

Job Purpose
Responsible for consulting with the client to understand their AI/ML and analytics needs and for delivering AI/ML applications to the client.

Job Description / Duties & Responsibilities
▪ Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems
▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders
▪ Work closely with the data engineering team to identify relevant data and pre-process it for suitable models
▪ Develop the designed solutions into statistical machine learning and AI models using suitable tools and frameworks
▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible
▪ Work closely with the application team to deliver AI/ML solutions as microservices

Job Specification / Skills and Competencies
▪ Master's/Bachelor's in Computer Science, Statistics or Economics
▪ At least 6 years of experience working in the data science field, with a passion for numbers and quantitative problems
▪ Deep understanding of machine learning models and algorithms
▪ Experience in analysing complex business problems, translating them into data science problems and modelling data science solutions for them
▪ Understanding of and experience in one or more of the following machine learning algorithms: Regression, Time Series, Logistic Regression, Naive Bayes, kNN, SVM, Decision Trees, Random Forest, k-Means Clustering, etc.
▪ NLP, Text Mining, LLMs (GPTs)
▪ Deep learning and reinforcement learning algorithms
▪ Understanding of and experience in one or more machine learning frameworks: TensorFlow, Caffe, Torch, etc.
▪ Understanding of and experience in building machine learning models using various packages in one or more programming languages: Python / R
▪ Knowledge of and experience with SQL, relational databases, NoSQL databases and data warehouse concepts
▪ Understanding of AWS/Azure cloud architecture
▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda)
▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.)
▪ Adherence to Information Security Management policies and procedures

Soft Skills Required
▪ Must be a good team player with good communication skills
▪ Must have good presentation skills
▪ Must be a proactive problem solver and a self-driven leader
▪ Ability to manage and nurture a team of data scientists
▪ A passion for numbers and patterns
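To make one of the algorithms named in this listing concrete, here is a minimal k-nearest-neighbours (kNN) classifier written with the standard library only, on invented toy data; production work would normally use a library such as scikit-learn:

```python
# Minimal kNN: classify a point by majority vote of its k nearest
# training examples (Euclidean distance). Toy data, illustrative only.
import math
from collections import Counter

def knn_predict(train, point, k=3):
    """train: list of (features, label) pairs; returns the majority
    label among the k training points nearest to `point`."""
    nearest = sorted(train, key=lambda fl: math.dist(fl[0], point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((1.0, 1.0), "a"), ((1.2, 0.8), "a"), ((0.9, 1.1), "a"),
         ((5.0, 5.0), "b"), ((5.2, 4.9), "b"), ((4.8, 5.1), "b")]
```

A point near the first cluster, e.g. `knn_predict(train, (1.1, 1.0))`, is voted into class "a" by its three nearest neighbours.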
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focussed and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Responsibilities
- Perform testing of IT application controls (ITAC/automated controls), IPE, and interface controls through code reviews, and IT general controls (ITGC/GITC) reviews covering areas such as change management, access management, backup management, incident and problem management, SDLC, data migration, batch job scheduling/monitoring, and business continuity and disaster recovery
- Perform risk assessment, identification and evaluation of controls; prepare process flow diagrams and document them in a Risk & Control Matrix
- Perform business process walkthroughs and controls testing for IT audits
- Plan and execute audits, including SOX, internal audits and external audits
- Conduct controls assessments in manual/automated environments
- Prepare/review policies, procedures and SOPs
- Maintain relationships with client management and the project manager to manage expectations of service, including work products, timing, and deliverables
- Demonstrate a thorough understanding of complex information systems and apply it to client situations
- Use extensive knowledge of the client's business/industry to identify technological developments and evaluate impacts on the work to be performed
- Coordinate effectively and efficiently with the engagement manager and client management, keeping both constantly updated on the project's progress
- Collaborate with other members of the engagement team to plan the engagement and develop relevant workpapers/deliverables
- Perform fieldwork and share the daily progress of fieldwork, informing supervisors of engagement status

Qualifications
- MBA/M.Tech/MS (full time) with a minimum of 3 years' experience
- IT audit + SAP experience, with knowledge of IT governance practices
- Prior IT audit knowledge in the areas of ITGC, ITAC (application/automated controls), SOX 404, and SOC 1 and SOC 2 audits
- Good to have: knowledge of other IT regulations, standards and benchmarks used by the IT industry (e.g. NIST, PCI-DSS, ITIL, OWASP, SOX, COBIT, SSAE 18/ISAE 3402, etc.)
- Technical knowledge of IT audit tools, with excellent knowledge of the IT audit process and methodology
- Exposure to risk management and governance frameworks/systems is an added advantage
- Exposure to ERP systems is an added advantage
- Strong project management, communication (written and verbal) and presentation skills
- Knowledge of security measures and auditing practices within various applications, operating systems, and databases
- Strong self-directed work habits, exhibiting initiative, drive, creativity, maturity, self-assurance, and professionalism
- Preferred certifications: CISA/CISSP/CISM
- Exposure to automation and data analytics tools such as QlikView/Qlik Sense, ACL, and Power BI is an advantage
- Proficiency with Microsoft Word, Excel, Visio, and other MS Office tools

Equal Opportunity Employer
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
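One common access-management test in the ITGC work described above is a segregation-of-duties (SoD) review: flagging users whose combined roles let them both create and approve the same transaction. The conflict matrix and user-role data below are entirely hypothetical, but the check itself is standard:

```python
# Simplified segregation-of-duties (SoD) check: flag users who hold
# conflicting role combinations. Roles and users are invented examples.

SOD_CONFLICTS = {
    frozenset({"create_vendor", "approve_payment"}),
    frozenset({"post_journal", "approve_journal"}),
}

def sod_violations(user_roles):
    """user_roles: dict of user -> set of roles.
    Returns dict of user -> sorted list of conflicting role pairs."""
    findings = {}
    for user, roles in user_roles.items():
        # A conflict fires when the user holds every role in the pair.
        hits = [tuple(sorted(pair)) for pair in SOD_CONFLICTS if pair <= roles]
        if hits:
            findings[user] = sorted(hits)
    return findings

users = {
    "alice": {"create_vendor", "approve_payment"},  # conflicting pair
    "bob": {"post_journal"},                        # no conflict alone
}
```

Real engagements would extract the user-role data from the ERP's security tables and use the client's approved SoD matrix rather than this toy one.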
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
At ABB, we help industries outrun - leaner and cleaner. Here, progress is an expectation - for you, your team, and the world. As a global market leader, we'll give you what you need to make it happen. It won't always be easy; growing takes grit. But at ABB, you'll never run alone. Run what runs the world.

This Position Reports To: HR Operations Team Lead

Your Role And Responsibilities
The People Analytics organization is a global function that works with various ABB business divisions and countries, delivering operational and expert services to ABB's HR community. We aim to unlock the potential of data to help ABB business leaders and managers take better decisions, enabling us to build a more sustainable and resource-efficient future in electrification and automation. In this role, you will partner with senior stakeholders to help them conceptualize KPIs in the areas of talent, performance and culture, and build statistical and analytics solutions from scratch, with a bias towards creating clean, effective, user-focused visualizations that deliver actionable insights and analysis. The technologies vary by purpose: Python, Snowflake, Power BI, advanced Excel, VBA, or other new technologies. The work model for the role is: Hybrid. This role contributes to the People Analytics function supporting various business functions based in Bangalore.

You Will Be Mainly Accountable For
- Capably interacting with and managing global ABB leadership to seek and provide meaningful and actionable insights in all interactions
- On-time delivery of actionable insights, from requirement gathering and data extraction to reporting/presenting the findings, in an IC role or with the team as per project needs
- Constantly looking out for ways to enhance value for your respective stakeholders/clients
- Developing frameworks and plug-and-play solutions using diagnostic, predictive and machine learning techniques on Snowflake/Python
- Executing strategic projects to help ABB improve excellence in people, performance, and culture

Qualifications For The Role
- Bachelor's/Master's degree in Applied Statistics/Mathematics, Engineering, Operations Research or a related field
- At least 3-5 years of experience in consulting, shared services or software development, with proficient data analysis techniques using technologies like Excel, VBA scripting, Python and Snowflake; understanding of SAP/Workday HCM/Snowflake systems is preferred
- A motivated mindset with advanced quantitative skills and the ability to work on large datasets; able to generate actionable insights from data analysis that translate into valuable business decisions for the client
- Capable EDA practitioner with extensive knowledge of advanced Excel functionality
- Experience in designing and maintaining dashboards/reports providing diagnostic and forecasting views using VBA, Power BI, Qlik or Tableau
- Collaborative worker with the excellent collaboration skills required for a global virtual work environment: team-oriented, self-motivated and able to lead small to mid-size projects

What's In It For You
We empower you to take initiative, challenge ideas, and lead with confidence. You'll grow through meaningful work, continuous learning, and support that's tailored to your goals. Every idea you share and every action you take contributes to something bigger.

More About Us
The ABB Robotics & Discrete Automation business area provides robotics, and machine and factory automation, including products, software, solutions and services. Revenues are generated both from direct sales to end users and from indirect sales, mainly through system integrators and machine builders. www.abb.com/robotics

#ABBCareers #RunwithABB #RunWhatRunsTheWorld

We value people from different backgrounds. Could this be your story? Apply today or visit www.abb.com to read more about us and learn about the impact of our solutions across the globe.

Fraud Warning: Any genuine offer from ABB will always be preceded by a formal application and interview process. We never ask for money from job applicants. For current open positions you can visit our career website https://global.abb/group/en/careers and apply. Please refer to the detailed recruitment fraud caution notice at https://global.abb/group/en/careers/how-to-apply/fraud-warning. 96249028
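A typical people-analytics KPI of the kind this role conceptualizes is monthly attrition rate, computed as leavers over average headcount. The figures below are invented for illustration:

```python
# Minimal people-analytics KPI sketch: monthly attrition rate as
# leavers divided by average headcount. All figures are invented.

def attrition_rate(leavers, opening_headcount, closing_headcount):
    """Leavers divided by average headcount for the period, as a percentage."""
    average_headcount = (opening_headcount + closing_headcount) / 2
    return round(100 * leavers / average_headcount, 2)

rate = attrition_rate(leavers=12, opening_headcount=980, closing_headcount=1020)
```

In practice the inputs would come from an HCM system such as Workday via Snowflake, and the result would feed a Power BI or Tableau dashboard; the sketch only shows the calculation.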
Posted 1 month ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities:
- Perform development and support activities for the data warehousing domain using ETL tools and technologies.
- Understand high-level design and application interface design, and build low-level design.
- Perform application analysis and propose technical solutions for application enhancement or to resolve production issues.
- Perform development and deployment tasks, including coding, unit testing, and deployment.
- Create the necessary documentation for all project deliverable phases.
- Handle production issues (Tier 2 support, weekend on-call rotation) and ensure SLAs are met.

Technical Skills:
Mandatory:
- Working experience in Azure Databricks/PySpark.
- Expert knowledge of Oracle/SQL, with the ability to write complex SQL/PL-SQL and performance-tune it.
- 2+ years of experience in Snowflake.
- 2+ years of hands-on experience in Spark or Databricks building data pipelines.
- Strong experience with cloud technologies.
- 1+ years of hands-on experience in development, performance tuning, and loading into Snowflake.
- Experience working with Azure Repos or GitHub.
- 1+ years of hands-on experience with Azure DevOps, GitHub, or any other DevOps tool.
- Hands-on experience in Unix and advanced Unix shell scripting.
- Open to working in shifts.

Good to Have:
- Willingness to learn all data warehousing technologies and work outside the comfort zone in other ETL technologies (Oracle, Qlik Replicate, GoldenGate, Hadoop); hands-on working experience is a plus.
- Knowledge of job schedulers.

Behavioral Skills:
- Eagerness and hunger to learn.
- Good problem-solving and decision-making skills.
- Good communication skills within the team, site, and with the customer.
- Ability to stretch working hours when necessary to support business needs.
- Ability to work independently and drive issues to closure; consult with relevant parties when necessary and raise risks in a timely manner.
- Ability to handle multiple and complex work assignments effectively while consistently delivering high-quality work.
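The Snowflake/Databricks load steps above require those platforms, but the core pattern is the same everywhere: an idempotent, re-runnable "upsert" (MERGE-style) load. As a stand-in, this sketch uses the standard-library sqlite3 module; the table and rows are invented:

```python
# Idempotent warehouse-load sketch: insert new keys, update existing
# ones, so re-running a batch never duplicates rows. Snowflake/Databricks
# would use MERGE; sqlite3 stands in here. Table and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")

def load(rows):
    """Re-runnable load step: upsert (id, name) pairs by primary key."""
    conn.executemany(
        "INSERT INTO dim_customer (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        rows,
    )

load([(1, "Acme"), (2, "Globex")])
load([(2, "Globex Corp"), (3, "Initech")])  # second batch updates id 2
names = dict(conn.execute("SELECT id, name FROM dim_customer ORDER BY id"))
```

Running the second batch updates the existing row for id 2 and inserts id 3, leaving exactly three rows; the same property is what makes production pipeline reruns safe.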
Matrix is a global, dynamic, and fast-growing leader in technical consultancy and technology services, employing over 13,000 professionals worldwide. Since its founding in 2001, Matrix has expanded through strategic acquisitions and significant ventures, cementing its position as a pioneer in the tech industry. We specialize in developing and implementing cutting-edge technologies, software solutions, and products. Our offerings include infrastructure and consulting services, IT outsourcing, offshore solutions, training, and assimilation. Matrix also proudly represents some of the world's leading software vendors.

With extensive experience spanning both private and public sectors - such as Finance, Telecom, Healthcare, Hi-Tech, Education, Defense, and Security - Matrix serves a distinguished clientele in Israel and an ever-expanding global customer base. Our success stems from a team of talented, creative, and dedicated professionals who are passionate about delivering innovative solutions. We prioritize attracting and nurturing top talent, recognizing that every employee's contribution is essential to our success. Matrix is committed to fostering a collaborative and inclusive work environment where learning, growth, and shared success thrive.

Join the winning team at Matrix! Here, you'll find a challenging yet rewarding career, competitive compensation and benefits, and opportunities to be part of a highly respected organization - all while having fun along the way. To learn more, visit: www.matrix-ifs.com

EQUAL OPPORTUNITY EMPLOYER: Matrix is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind. Matrix is committed to the principle of equal employment opportunity for all employees, providing employees with a work environment free of discrimination and harassment.
All employment decisions at Matrix are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion or belief, family or parental status, or any other status protected by the laws or regulations in our locations. Matrix will not tolerate discrimination or harassment based on any of these characteristics. Matrix encourages applicants of all ages.
Posted 1 month ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Position: Business Analyst
BU: Financial Management - Business Finance (Corporate)

Objective
To prepare and provide various reports and insights to aid senior management in strategic decision-making.

Responsibilities
- Prepare and analyze profitability statements and trends in business performance across products/teams/geographies for management decision-making.
- Engage in product profitability with various product groups such as Trade and CMS Services, Treasury, etc.
- Prepare and analyze business and product KPI development and measurement.
- Prepare budgets for corporate business segments and products in coordination with various stakeholders.
- Projects: FTP implementation as per policy, automation of dashboards, data enhancements and process improvements.
- Prepare theme-based analyses for senior management and other stakeholders.
- Manage and resolve internal audit queries.

Essential Competencies
- 2+ years' experience, preferably in the banking industry
- Conceptual understanding of wholesale banking products
- Experience in profitability analysis of banking products in corporate lending, trade services, treasury, etc.
- Proficiency in Microsoft Excel and exposure to MIS automation
- Proficiency in providing accurate information with insights in a fast-paced, decision-centric environment
- Familiarity with reporting tools such as Qlik, Tableau, etc.
- Good communication and presentation skills

Qualifications
MBA/CA
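The funds-transfer-pricing (FTP) implementation mentioned above splits product profitability around an internal transfer rate: a loan earns the customer rate minus the FTP rate, while a deposit earns the FTP rate minus the customer rate. A minimal sketch, with all rates invented for illustration:

```python
# Simplified funds-transfer-pricing (FTP) margin split. A loan's margin
# is the customer rate over the internal FTP rate; a deposit's margin is
# the FTP rate over what the bank pays the customer. Rates are invented.

def lending_margin(customer_rate, ftp_rate):
    """Margin earned by the lending business over the FTP rate."""
    return round(customer_rate - ftp_rate, 4)

def deposit_margin(ftp_rate, customer_rate):
    """Margin earned by the deposit business under the FTP rate."""
    return round(ftp_rate - customer_rate, 4)

loan_margin = lending_margin(customer_rate=0.095, ftp_rate=0.070)  # 2.5%
dep_margin = deposit_margin(ftp_rate=0.070, customer_rate=0.045)   # 2.5%
```

An actual FTP policy adds curve-based rates by tenor, liquidity premiums and behavioural adjustments; the sketch only shows the basic split that product profitability reports are built on.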
Posted 1 month ago
6.0 - 11.0 years
15 - 27 Lacs
Bengaluru
Hybrid
Primary Responsibilities:
Develop visual reports, dashboards and KPI scorecards using Power BI Desktop.
Build Analysis Services reporting models.
Connect to data sources, importing and transforming data for business intelligence.
Implement row-level security on data and understand application security layer models in Power BI.
Integrate Power BI reports into other applications using embedded analytics such as the Power BI service (SaaS) or API automation.
Use advanced-level calculations on the data set.
Design and develop Azure-based, data-centric applications to manage large healthcare data workloads.
Design, build, test and deploy streaming pipelines for data processing in real time and at scale.
Create ETL packages.
Use Azure cloud services for ingestion and data processing.
Own feature development using Microsoft Azure native services such as App Service, Azure Functions, Azure Storage, Service Bus queues, Event Hubs, Event Grid, Application Gateway, Azure SQL, Azure Databricks, etc.
Identify opportunities to fine-tune and optimize applications running on Microsoft Azure: cost reduction, adoption of cloud best practices, and data and application security covering scalability and high availability.
Mentor the team on the infrastructure, networking, data migration, monitoring and troubleshooting aspects of Microsoft Azure.
Focus on automation using Infrastructure as Code (IaC), Jenkins, Azure DevOps, Terraform, etc.
Communicate effectively with other engineers and QA.
Establish, refine and integrate development and test environment tools and software as needed.
Identify production and non-production application issues.
This is a Senior Cloud Data Engineer position requiring about 7+ years of hands-on technical experience in data processing, reporting and cloud technologies, along with working knowledge of executing projects using Agile methodologies.
Required Skills:
1. Be able to envision the overall solution for defined functional and non-functional requirements, and define the technologies, patterns and frameworks to materialize it.
2. Design and develop the framework of the system and be able to explain the choices made; write and review design documents explaining the overall architecture, framework and high-level design of the application.
3. Create, understand and validate the design and estimated effort for a given module/task, and be able to justify them.
4. Be able to define what is in scope, what is out of scope, and the assumptions taken while creating effort estimates.
5. Be able to identify and integrate all integration points in the context of the project as well as other applications in the environment.
6. Understand the business requirements and develop data models.
Technical Skills:
1. Strong proficiency as a Cloud Data Engineer utilizing Power BI and Azure Databricks to support, design, develop and deploy requested updates to new and existing cloud-based services.
2. Experience with developing, implementing, monitoring and troubleshooting applications in the Azure public cloud.
3. Proficiency in data modeling and reporting.
4. Design and implement database schemas.
5. Design and develop well-documented source code.
6. Develop both unit-testing and system-testing scripts to be incorporated into the QA process.
7. Automate all deployment steps with Infrastructure as Code (IaC) and Jenkins Pipeline as Code (JPaC) concepts.
8. Define guidelines and benchmarks for NFR considerations during project implementation.
9. Perform the required POCs to confirm that the suggested designs and technologies meet the requirements.
Required Experience:
5 to 10+ years of professional experience developing with SQL, Power BI, SSIS and Azure Databricks.
5 to 10+ years of professional experience utilizing SQL Server for data storage in large-scale .NET solutions.
Strong technical writing skills.
Strong knowledge of build, deployment and unit-testing tools.
Highly motivated team player and self-starter.
Excellent verbal, phone, and written communication skills.
Knowledge of cloud-based architecture and concepts.
Required Qualifications:
Graduate or Post Graduate in Computer Science, Engineering, Science, Mathematics or a related field, with around 10 years of experience in executing data reporting solutions.
Cloud certification, preferably Azure.
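The ETL and streaming-pipeline duties above lend themselves to a small illustration. The following is a hedged, library-free Python sketch (not this employer's code) of the kind of validate-and-normalize step an ingestion pipeline in Azure Databricks might apply; the record fields are invented for the example.

```python
# Library-free sketch of a validate-and-normalize ETL step.
# Field names ("patient_id", "charge") are hypothetical.

def transform(record):
    """Clean one raw record; return None to drop it from the load."""
    pid = str(record.get("patient_id", "")).strip()
    if not pid:
        return None  # reject records without an identifier
    try:
        charge = round(float(record["charge"]), 2)
    except (KeyError, TypeError, ValueError):
        return None  # reject malformed monetary amounts
    return {"patient_id": pid, "charge": charge}

def run_batch(records):
    """Apply the transform to a batch, silently dropping invalid rows."""
    return [r for r in (transform(rec) for rec in records) if r is not None]

if __name__ == "__main__":
    raw = [
        {"patient_id": " P001 ", "charge": "19.5"},
        {"patient_id": "", "charge": "10"},        # dropped: missing id
        {"patient_id": "P002", "charge": "oops"},  # dropped: bad amount
    ]
    print(run_batch(raw))  # [{'patient_id': 'P001', 'charge': 19.5}]
```

In a real Databricks pipeline the same logic would typically live in a Spark DataFrame expression or UDF; the sketch only shows the shape of the cleanse stage.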
Posted 1 month ago
9.0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
Total years of experience: 9 years (relevant experience of 8 years is a must)
Location: TVM/Kochi – hybrid, 3 days per week (Chennai and Bangalore candidates must work from the office during the initial months)
Notice period: Immediate to 30 days
Salary: Maximum 45 LPA
Job Purpose
Responsible for consulting with the client to understand their AI/ML and analytics needs, and for delivering AI/ML applications to the client.
Job Description / Duties & Responsibilities
▪ Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems
▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders
▪ Work closely with the data engineering team to identify relevant data and pre-process it for suitable models
▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks
▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible
▪ Work closely with the application team to deliver AI/ML solutions as microservices
Job Specification / Skills and Competencies
▪ Master's or Bachelor's in Computer Science, Statistics or Economics
▪ At least 4 years of experience working in the data science field, with a passion for numbers and quantitative problems
▪ Deep understanding of machine learning models and algorithms
▪ Experience in analysing complex business problems, translating them into data science problems and modelling data science solutions for them
▪ Understanding of and experience in one or more of the following machine learning algorithms: regression, time series, logistic regression, Naive Bayes, kNN, SVM, decision trees, random forest, k-means clustering, etc.
▪ NLP, text mining, LLMs (GPTs)
▪ Deep learning, reinforcement learning algorithms
▪ Understanding of and experience in one or more machine learning frameworks: TensorFlow, Caffe, Torch, etc.
▪ Understanding of and experience in building machine learning models using various packages in one or more programming languages: Python / R
▪ Knowledge of and experience with SQL, relational databases, NoSQL databases and data warehouse concepts
▪ Understanding of AWS/Azure cloud architecture
▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda)
▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.)
▪ Adherence to the Information Security Management policies and procedures
Soft Skills Required
▪ Must be a good team player with good communication skills
▪ Must have good presentation skills
▪ Must be a pro-active problem solver and a self-driven leader
▪ Ability to manage and nurture a team of data scientists
▪ A passion for numbers and patterns
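Since kNN appears in the algorithm list above, here is a minimal, library-free Python sketch of it. This is only an illustration of the core idea — real projects would use scikit-learn — and the toy training points are invented.

```python
# Minimal k-nearest-neighbours classifier: classify a query point by a
# majority vote among its k closest training points (Euclidean distance).
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of (features, label) pairs; query: a feature tuple."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    train = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((6, 5), "b")]
    print(knn_predict(train, (1, 0)))  # 'a' — its nearest neighbours are class 'a'
```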
Posted 1 month ago
2.0 - 3.0 years
7 Lacs
Mumbai
On-site
Job Title: Tableau Developer
Experience: 2-3 Years
Location: Mumbai, India
About the Role: We are seeking a highly motivated and skilled Tableau Developer with proven experience to join our dynamic team in Mumbai. In this role, you will be instrumental in transforming complex data into insightful and interactive dashboards and reports using Tableau. You will work closely with business stakeholders, data analysts, and other technical teams to understand reporting requirements, develop effective data visualizations, and contribute to data-driven decision-making within the organization.
Roles and Responsibilities:
Dashboard Development:
Design, develop, and maintain compelling and interactive Tableau dashboards and reports that meet business requirements and enhance user experience.
Create various types of visualizations, including charts, graphs, maps, and tables, to effectively communicate data insights.
Implement advanced Tableau features such as calculated fields, parameters, sets, groups, and Level of Detail (LOD) expressions to create sophisticated analytics.
Optimize Tableau dashboards for performance and scalability, ensuring quick loading times and efficient data retrieval.
Data Sourcing and Preparation:
Connect to various data sources (e.g., SQL Server, Oracle, Excel, cloud-based data platforms like AWS Redshift, Google BigQuery, etc.) and extract, transform, and load (ETL) data for reporting purposes.
Perform data analysis, validation, and cleansing to ensure the accuracy, completeness, and consistency of data used in reports.
Collaborate with data engineers and data analysts to understand data structures, identify data gaps, and ensure data quality.
Requirements Gathering & Collaboration:
Work closely with business users, stakeholders, and cross-functional teams to gather and understand reporting and analytical requirements.
Translate business needs into technical specifications and develop effective visualization solutions.
Participate in discussions and workshops to refine requirements and propose innovative reporting approaches.
Troubleshooting and Support:
Diagnose and resolve issues related to data accuracy, dashboard performance, and report functionality.
Provide ongoing support and maintenance for existing Tableau dashboards and reports.
Assist end-users with Tableau-related queries and provide training as needed.
Documentation and Best Practices:
Create and maintain comprehensive documentation for Tableau dashboards, data sources, and development processes.
Adhere to data visualization best practices and design principles to ensure consistency and usability across all reports.
Contribute to code reviews and knowledge sharing within the team.
Continuous Improvement:
Stay up-to-date with the latest Tableau features, updates, and industry trends in data visualization and business intelligence.
Proactively identify opportunities for improvement in existing reports and propose enhancements.
Participate in an Agile development environment, adapting to changing priorities and contributing to sprint goals.
Required Skills and Qualifications:
Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
2 years of hands-on experience as a Tableau Developer, with a strong portfolio of developed dashboards and reports.
Proficiency in Tableau Desktop and Tableau Server (including publishing, managing permissions, and performance monitoring).
Strong SQL skills for data extraction, manipulation, and querying from various databases.
Solid understanding of data warehousing concepts, relational databases, and ETL processes.
Familiarity with data visualization best practices and design principles.
Excellent analytical and problem-solving skills with a keen eye for detail.
Strong communication skills (verbal and written) with the ability to explain complex data insights to non-technical stakeholders.
Ability to work independently and collaboratively in a team-oriented environment. Adaptability to changing business requirements and a fast-paced environment. Additional Qualifications: Experience with other BI tools (e.g., Power BI, Qlik Sense) is a plus. Familiarity with scripting languages like Python or R for advanced data manipulation and analytics. Knowledge of cloud data platforms (e.g., AWS, Azure, GCP). Experience with Tableau Prep for data preparation. Job Types: Full-time, Permanent Pay: Up to ₹750,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Monday to Friday Work Location: In person
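The posting above stresses strong SQL for data extraction behind dashboards. As a hedged illustration, the snippet below runs the kind of aggregation a dashboard data source might issue; the table and column names are invented, and the standard-library sqlite3 module stands in for the SQL Server/Oracle sources the role actually mentions.

```python
# Hypothetical "monthly sales per region" aggregation, the sort of query
# a Tableau data source might run. Schema and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("West", "2024-01", 100.0), ("West", "2024-01", 50.0),
     ("East", "2024-01", 70.0)],
)
rows = conn.execute(
    """SELECT region, month, SUM(amount) AS total
       FROM sales GROUP BY region, month ORDER BY region"""
).fetchall()
print(rows)  # [('East', '2024-01', 70.0), ('West', '2024-01', 150.0)]
```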
Posted 1 month ago
0 years
0 Lacs
Chennai
On-site
Job Description
Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, carry out an elicitation of processes, data and capabilities, and derive the target processes and the business requirements for the current and future solution.
Job Description - Grade Specific
Performs analysis of processes, systems, data and business information and research, and builds up domain knowledge.
Skills (competencies): Abstract Thinking, Active Listening, Agile (Software Development Framework), Analytical Thinking, Backlog Grooming, Business Architecture Modeling, Business Process Modeling (e.g. BPMN), Change Management, Coaching, Collaboration, Commercial Acumen, Conceptual Data Modeling, Conflict Management, Confluence, Critical Thinking, CxO Conversations, Data Analysis, Data Requirements Management, Decision-Making, Emotional Intelligence, Enterprise Architecture Modelling, Facilitation, Functional IT Architecture Modelling, Giving Feedback, Google Cloud Platform (GCP), Influencing, Innovation, Jira, Mediation, Mentoring, Microsoft Office, Motivation, Negotiation, Networking, Power BI, Presentation Skills, Prioritization, Problem Solving, Project Governance, Project Management, Project Planning, Qlik, Relationship-Building, Requirements Gathering, Risk Management, Scope Management, SQL, Stakeholder Management, Story Mapping, Storytelling, Strategic Management, Strategic Thinking, SWOT Analysis, Systems Requirement Analysis (or Management), Tableau, Trusted Advisor, UI Design / Wireframing, UML, User Journey, User Research, Verbal Communication, Written Communication
Posted 1 month ago
5.0 years
5 - 10 Lacs
Bengaluru
On-site
Job requisition ID: 84163
Date: Jun 23, 2025
Location: Bengaluru
Designation: Senior Consultant
We are seeking a Senior Data Engineer with extensive experience in cloud platforms and data engineering tools, with a strong emphasis on Databricks. The ideal candidate will have deep expertise in designing and optimizing data pipelines, building scalable ETL workflows, and leveraging Databricks for advanced analytics and data processing. Experience with Google Cloud Platform is beneficial, particularly in integrating Databricks with cloud storage solutions and data warehouses such as BigQuery. The candidate should have a proven track record of working on data enablement projects across various data domains and be well-versed in the Data as a Product approach, ensuring data solutions are scalable, reusable, and aligned with business needs.
Key Responsibilities:
Design, develop, and optimize scalable data pipelines using Databricks, ensuring efficient data ingestion, transformation, and processing.
Implement and manage data storage solutions, including Delta tables for structured storage and seamless data versioning.
Leverage Databricks for advanced data processing, including the development and optimization of data workflows, Delta Live Tables, and ML-based data transformations.
Monitor and optimize Databricks performance, focusing on cluster configurations, resource utilization, and Delta table performance tuning.
Collaborate with cross-functional teams to drive data enablement projects, ensuring scalable, reusable, and efficient solutions using Databricks.
Apply the Data as a Product / Data as an Asset approach, ensuring high data quality, accessibility, and usability within Databricks environments.
Required Experience:
5+ years of experience with cloud data services, with a strong focus on Databricks and its integration with Google Cloud Platform storage and analytics tools such as BigQuery.
5+ years of experience with analytical software and languages, including Spark (Databricks Runtime), Python, and SQL for data engineering and analytics.
Strong expertise in Data Structures and Algorithms (DSA) and problem-solving, enabling efficient design and optimization of data workflows.
Experience with CI/CD pipelines using GitHub for automated data pipeline deployments within Databricks.
Experience in Agile/Scrum environments, contributing to iterative development processes and collaboration within data engineering teams.
Experience in data streaming is a plus, particularly leveraging Kafka or Spark Structured Streaming within Databricks.
Familiarity with other ETL/ELT tools is a plus, such as Qlik Replicate, SAP Data Services, or Informatica, with a focus on integrating these with Databricks.
Qualifications:
A Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.
Over 5 years of hands-on experience in data engineering or a closely related field.
Proven expertise in AWS and Databricks platforms.
Advanced skills in data modeling and designing optimized data structures.
Knowledge of Azure DevOps and proficiency in Scrum methodologies.
Exceptional problem-solving abilities paired with a keen eye for detail.
Strong interpersonal and communication skills for seamless collaboration.
A minimum of one certification in AWS or Databricks, such as Cloud Engineering, Data Services, Cloud Practitioner, Certified Data Engineer, or an equivalent from reputable MOOCs.
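The Delta table work above centres on upsert ("MERGE INTO") semantics: update rows whose keys match, insert the rest. As a hedged, pure-Python sketch of just those semantics — real pipelines would use Spark SQL or delta-rs, and the keys and fields below are invented:

```python
# Pure-Python model of MERGE INTO upsert semantics on a keyed table:
# matched keys are overwritten, unmatched keys are inserted.

def merge_upsert(target, updates, key):
    """Apply a batch of update rows to a target table keyed by `key`."""
    for row in updates:
        target[row[key]] = row  # matched -> update; new key -> insert
    return target

if __name__ == "__main__":
    table = {1: {"id": 1, "qty": 5}}
    merge_upsert(table, [{"id": 1, "qty": 7}, {"id": 2, "qty": 3}], "id")
    print(table)  # {1: {'id': 1, 'qty': 7}, 2: {'id': 2, 'qty': 3}}
```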
Posted 1 month ago
6.0 years
0 Lacs
Kochi, Kerala, India
On-site
Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems
▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders
▪ Work closely with the data engineering team to identify relevant data and pre-process it for suitable models
▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks
▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible
▪ Work closely with the application team to deliver AI/ML solutions as modular offerings
Skills/Specification
▪ Master's or Bachelor's in Computer Science, Statistics or Economics
▪ At least 6 years of experience working in the data science field, with a passion for numbers and quantitative problems
▪ Deep understanding of machine learning models and algorithms
▪ Experience in analysing complex business problems, translating them into data science problems and modelling data science solutions for them
▪ Understanding of and experience in one or more of the following machine learning algorithms: regression, time series, logistic regression, Naive Bayes, kNN, SVM, decision trees, random forest, k-means clustering, etc.
▪ NLP, text mining, LLMs (GPTs): OpenAI, Azure OpenAI, AWS Bedrock, Gemini, Llama, DeepSeek, etc. (knowledge of fine-tuning / custom-training GPTs would be an added advantage)
▪ Deep learning, reinforcement learning algorithms
▪ Understanding of and experience in one or more machine learning frameworks: TensorFlow, Caffe, Torch, etc.
▪ Understanding of and experience in building machine learning models using various packages in Python
▪ Knowledge of and experience with SQL, relational databases, NoSQL databases and data warehouse concepts
▪ Understanding of AWS/Azure cloud architecture
▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda)
▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.)
▪ Adherence to the Information Security Management policies and procedures
Soft Skills Required
▪ Must be a good team player with good communication skills
▪ Must have good presentation skills
▪ Must be a pro-active problem solver and a self-driven leader
▪ Ability to manage and nurture a team of data scientists
▪ A passion for numbers and patterns
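Logistic regression is one of the algorithms listed above. As a hedged, library-free sketch of how it is fitted (by gradient descent on the log-loss) — real work would use scikit-learn, and the toy 1-D data below is invented:

```python
# Logistic regression on toy 1-D data, trained with per-sample gradient
# descent. The data is separable around x = 1.5, so the learned boundary
# should land between the two classes.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(xs, ys, lr=0.5, epochs=3000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            w -= lr * (p - y) * x / n  # gradient of log-loss w.r.t. w
            b -= lr * (p - y) / n      # gradient of log-loss w.r.t. b
    return w, b

if __name__ == "__main__":
    xs, ys = [0.0, 1.0, 2.0, 3.0], [0, 0, 1, 1]
    w, b = fit(xs, ys)
    print(sigmoid(w * 1.0 + b) < 0.5, sigmoid(w * 2.0 + b) > 0.5)
```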
Posted 1 month ago
9.0 years
0 Lacs
Kerala, India
Remote
Position: AI Architect (permanent only)
Experience: 9+ years (relevant experience of 8 years is a must)
Budget: Up to ₹40–45 LPA
Notice period: Immediate to 45 days
Key skills: Python, Data Science (AI/ML), SQL
Location: TVM/Kochi/remote
Job Purpose
Responsible for consulting with the client to understand their AI/ML and analytics needs, and for delivering AI/ML applications to the client.
Job Description / Duties & Responsibilities
▪ Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems
▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders
▪ Work closely with the data engineering team to identify relevant data and pre-process it for suitable models
▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks
▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible
▪ Work closely with the application team to deliver AI/ML solutions as microservices
Job Specification / Skills and Competencies
▪ Master's or Bachelor's in Computer Science, Statistics or Economics
▪ At least 6 years of experience working in the data science field, with a passion for numbers and quantitative problems
▪ Deep understanding of machine learning models and algorithms
▪ Experience in analysing complex business problems, translating them into data science problems and modelling data science solutions for them
▪ Understanding of and experience in one or more of the following machine learning algorithms: regression, time series, logistic regression, Naive Bayes, kNN, SVM, decision trees, random forest, k-means clustering, etc.
▪ NLP, text mining, LLMs (GPTs)
▪ Deep learning, reinforcement learning algorithms
▪ Understanding of and experience in one or more machine learning frameworks: TensorFlow, Caffe, Torch, etc.
▪ Understanding of and experience in building machine learning models using various packages in one or more programming languages: Python / R
▪ Knowledge of and experience with SQL, relational databases, NoSQL databases and data warehouse concepts
▪ Understanding of AWS/Azure cloud architecture
▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda)
▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.)
▪ Adherence to the Information Security Management policies and procedures
Soft Skills Required
▪ Must be a good team player with good communication skills
▪ Must have good presentation skills
▪ Must be a pro-active problem solver and a self-driven leader
▪ Ability to manage and nurture a team of data scientists
▪ A passion for numbers and patterns
Posted 1 month ago
7.0 years
15 - 25 Lacs
Pune, Maharashtra, India
On-site
At Improzo ( Improve + Zoe; meaning Life in Greek ), we believe in improving life by empowering our customers. Founded by seasoned Industry leaders, we are laser focused on delivering quality-led commercial analytical solutions to our clients. Our dedicated team of experts in commercial data, technology, and operations has been evolving and learning together since our inception. Here, you won't find yourself confined to a cubicle; instead, you'll be navigating open waters, collaborating with brilliant minds to shape the future. You will work with leading Life Sciences clients, seasoned leaders and carefully chosen peers like you! People are at the heart of our success, so we have defined our CARE values framework with a lot of effort, and we use it as our guiding light in everything we do. We CARE! Customer-Centric: Client success is our success. Prioritize customer needs and outcomes in every action. Adaptive: Agile and Innovative, with a growth mindset. Pursue bold and disruptive avenues that push the boundaries of possibilities. Respect: Deep respect for our clients & colleagues. Foster a culture of collaboration and act with honesty, transparency, and ethical responsibility. Execution: Laser focused on quality-led execution; we deliver! Strive for the highest quality in our services, solutions, and customer experiences. About The Role Introduction: We are seeking an experienced and highly skilled Snowflake Data Lead/Architect to lead strategic projects focused on Pharma Commercial Data Management. This role demands a professional with 7-9 years of experience in data architecture, data management, ETL, data transformation, and governance, with an emphasis on providing scalable and secure data solutions for the pharmaceutical sector. The ideal candidate will bring a deep understanding of data architecture principles, experience with cloud platforms like Snowflake and Databricks, and a solid background in driving commercial data management projects. 
If you're passionate about leading impactful data initiatives, optimizing data workflows, and supporting the pharmaceutical industry's data needs, we invite you to apply. Responsibilities Key Responsibilities: Snowflake Solution Design & Development: Work closely with client stakeholders, data architects, and business analysts to understand detailed commercial data requirements and translate them into efficient Snowflake technical designs. Design, develop, and optimize complex ETL/ELT processes within Snowflake using SQL, Stored Procedures, UDFs, Streams, Tasks, and other Snowflake features. Implement data models (dimensional, star, snowflake schemas) optimized for commercial reporting, analytics, and data science use cases. Implement data governance, security, and access controls within Snowflake, adhering to strict pharmaceutical compliance regulations (e.g., HIPAA, GDPR, GxP principles). Develop and manage data sharing and collaboration solutions within Snowflake for internal and external partners. Optimize Snowflake warehouse sizing, query performance, and overall cost efficiency. Data Integration Integrate data from various commercial sources, including CRM systems (e.g., Veeva, Salesforce), sales data (e.g., IQVIA, Symphony), marketing platforms, patient services data, RWD, and other relevant datasets into Snowflake. Utilize tools like Fivetran, Azure Data Factory or custom Python scripts for data ingestion and transformation. Tech Leadership & Expertise Provide technical expertise and support for Snowflake-related issues, troubleshooting data discrepancies and performance bottlenecks. Participate in code reviews, ensuring adherence to best practices and coding standards. Mentor junior developers and contribute to the growth of the data engineering team. Data Quality, Governance & Security Implement robust data quality checks, validation rules, and reconciliation processes to ensure accuracy and reliability of commercial data. 
Apply and enforce data governance policies, including data lineage, metadata management, and master data management principles. Implement and maintain strict data security, access controls, and data masking techniques within Snowflake, adhering to pharmaceutical industry compliance standards (e.g., HIPAA, GDPR, GxP principles). Required Qualifications Bachelor's degree in Computer Science, Information Systems, Engineering, or a related quantitative field. Master's degree preferred. 7+ years of progressive experience in data warehousing, ETL/ELT development, and data engineering roles. 4+ years of hands-on, in-depth experience as a Snowflake Developer, with a proven track record of designing and implementing complex data solutions on the Snowflake platform. Expert-level proficiency in SQL for data manipulation, complex query optimization, and advanced stored procedure development within Snowflake. Strong understanding and practical experience with data modeling techniques (e.g., Dimensional Modeling, Data Vault). Experience with data integration tools for Snowflake (e.g., Fivetran, Matillion, DBT, Airflow, or custom Python-based ETL frameworks). Proficiency in at least one scripting language (e.g., Python) for data processing, API integration, and automation. Demonstrable understanding of data governance, data security, and regulatory compliance within the pharmaceutical or other highly regulated industries (e.g., GxP, HIPAA, GDPR, PII). Experience working in a client-facing or consulting environment with strong communication and presentation skills. Excellent problem-solving, analytical, and communication skills. Ability to work independently and as part of a collaborative team in a fast-paced environment. Preferred Qualifications Specific experience with pharmaceutical commercial data sets such as sales data (e.g., IQVIA, Symphony), CRM data (e.g., Veeva, Salesforce), claims data, patient services data, or master data management (MDM) for commercial entities. 
Knowledge of commercial analytics concepts and KPIs in the pharma industry (e.g., sales performance, market share, patient adherence). Experience working with cloud platforms (AWS, Azure, or GCP) and their native services for data storage and processing. Experience with version control systems (e.g., Git). Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced). Experience with data visualization tools (e.g., Tableau, Power BI, Qlik Sense) and their connectivity to Snowflake. Knowledge of Agile methodologies for managing data projects.
Benefits
Competitive salary and benefits package.
Opportunity to work on cutting-edge tech projects transforming the life sciences industry.
Collaborative and supportive work environment.
Opportunities for professional development and growth.
Skills: data visualization tools, Data Vault, Azure Data Factory, data architecture, client-facing, data governance, data quality, data, Snowflake, SQL, data integration, Fivetran, pharma commercial, data security, Python, dimensional modeling, ETL, data management
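The posting above calls for data masking under HIPAA/GDPR. As a hedged, pure-Python sketch of deterministic pseudonymization — similar in spirit to the masking policies Snowflake applies via its SQL masking-policy feature, but not Snowflake code — with an invented salt and invented field names:

```python
# Deterministic PII masking: replace an identifier with a stable pseudonym
# so joins across tables still work while the raw value is hidden.
import hashlib

SALT = "demo-salt"  # hypothetical; real deployments keep salts in a secrets manager

def mask_patient_id(value):
    """Hash the value with a salt and keep a short, labelled pseudonym."""
    digest = hashlib.sha256((SALT + value).encode()).hexdigest()
    return "PAT-" + digest[:8]

def mask_row(row, pii_fields=("patient_id",)):
    """Mask only the configured PII fields, leaving other columns intact."""
    return {k: (mask_patient_id(v) if k in pii_fields else v)
            for k, v in row.items()}

if __name__ == "__main__":
    row = {"patient_id": "123-45-6789", "rx_count": 4}
    masked = mask_row(row)
    print(masked["rx_count"], masked["patient_id"].startswith("PAT-"))  # 4 True
```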
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Test Engineer
Location: Hyderabad (Onsite)
Experience Required: 5 Years
Job Description: We are looking for a detail-oriented and skilled Test Engineer with 5 years of experience in testing SAS applications and data pipelines. The ideal candidate should have a solid background in SAS programming, data validation, and test automation within enterprise data environments.
Key Responsibilities:
Conduct end-to-end testing of SAS applications and data pipelines to ensure accuracy and performance.
Write and execute test cases/scripts using Base SAS, Macros, and SQL.
Perform SQL query validation and data reconciliation using industry-standard practices.
Validate ETL pipelines developed using tools like Talend, IBM Data Replicator, and Qlik Replicate.
Conduct data integration testing with Snowflake and use explicit pass-through SQL to ensure integrity across platforms.
Utilize test automation frameworks using Selenium, Python, or shell scripting to increase test coverage and reduce manual effort.
Identify, document, and track bugs through resolution, ensuring high-quality deliverables.
Required Skills:
Strong experience in SAS programming (Base SAS, Macro).
Expertise in writing and validating SQL queries.
Working knowledge of data testing frameworks and reconciliation tools.
Experience with Snowflake and ETL validation tools like Talend, IBM Data Replicator, and Qlik Replicate.
Proficiency in test automation using Selenium, Python, or shell scripts.
Solid understanding of data pipelines and data integration testing practices.
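The data-reconciliation duty above can be sketched simply: after an ETL load, compare source and target on row counts and column totals. A hedged, library-free Python example (the data, field name and checks are invented; real pipelines would run these checks as SQL against both platforms):

```python
# Minimal source-vs-target reconciliation: compare row counts and the sum
# of an amount column; an empty result means the load reconciles.

def reconcile(source, target, amount_key="amt"):
    """Return a dict of mismatches between two loaded datasets."""
    issues = {}
    if len(source) != len(target):
        issues["row_count"] = (len(source), len(target))
    s_sum = round(sum(r[amount_key] for r in source), 2)
    t_sum = round(sum(r[amount_key] for r in target), 2)
    if s_sum != t_sum:
        issues["amount_sum"] = (s_sum, t_sum)
    return issues

if __name__ == "__main__":
    src = [{"amt": 10.0}, {"amt": 5.5}]
    tgt = [{"amt": 10.0}, {"amt": 5.0}]  # one value drifted during the load
    print(reconcile(src, tgt))  # {'amount_sum': (15.5, 15.0)}
```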
Posted 1 month ago
15.0 - 24.0 years
40 - 60 Lacs
Hyderabad, Chennai
Work from Office
Job Title: Technical Program Manager – Data Engineering & Analytics
Experience: 16-25 years (relevant years)
Salary: Based on current CTC
Location: Chennai and Hyderabad
Notice Period: Immediate joiners only
Critical Expectations:
1) The candidate should have handled a team of at least 100 people.
2) Should have a minimum of 8 years' experience in Data and AI development.
3) Should have experience in complex cloud data migrations.
Position Overview: We are seeking an experienced Program Manager to lead large-scale, complex Data, BI, and AI/ML initiatives. The ideal candidate will have a deep technical understanding of modern data architectures, hands-on expertise in end-to-end solution delivery, and a proven ability to manage client relationships and multi-functional teams. This role will involve driving innovation, operational excellence, and strategic growth within Data Engineering & Analytics programs.
Job Description:
Responsible for managing large and complex programs encompassing multiple Data, BI and AI/ML solutions.
Lead the design, development, and implementation of Data Engineering & Analytics solutions involving Teradata, Google Cloud Platform (GCP), AI/ML, Qlik, Tableau, etc.
Work closely with clients to understand their needs and translate them into technology solutions.
Provide technical leadership to solve complex business issues that translate into data analytics solutions.
Prepare operational/strategic reports on defined cadences and present them to steering and operational committees via WSRs, MSRs, etc.
Responsible for ensuring compliance with defined service level agreement (SLA) and key performance indicator (KPI) metrics.
Track and monitor the performance of services, identify areas for improvement, and implement changes as needed.
Continuously evaluate and improve processes to ensure that services are delivered efficiently and effectively.
Proactively identify issues and risks, and prepare appropriate mitigation/resolution plans.
Foster a positive work environment and build a culture of automation and innovation to improve service delivery performance.
Develop the team as a coach and mentor; support and manage team members.
Create SOWs, proposals, solutions and estimations for data analytics solutions.
Contribute to building the Data Analytics and AI/ML practice by creating case studies, POCs, etc.
Shape opportunities and create execution approaches throughout the lifecycle of client engagements.
Collaborate with various functions/teams in the organization to support recruitment, hiring, onboarding and other operational activities.
Maintain positive relationships with all stakeholders and ensure proactive responses to opportunities and challenges.
Must-Have Skills:
Deep hands-on expertise in end-to-end solution lifecycle management in Data Engineering and Data Management.
Strong technical understanding of modern data architectures and solutions.
Ability to execute an implementation strategy through a roadmap and collaboration with different stakeholders.
Understanding of cloud data architecture and data modeling concepts and principles, including cloud data lakes, warehouses and marts, dimensional modeling, star schemas, and real-time and batch ETL/ELT.
Experience driving AI/ML and GenAI projects would be good to have.
Experience with cloud-based data analytics platforms such as GCP, Snowflake, Azure, etc.
Good understanding of the SDLC and Agile methodologies.
A telecom background would be good to have.
Must have handled a team size of 50+.
Qualification:
15-20 years of experience primarily working on Data Warehousing, BI & Analytics, and Data Management projects in tech architect, delivery, client relationship and practice roles, involving ETL, reporting, big data and analytics.
Experience architecting, designing and developing Data Engineering, Business Intelligence and reporting projects.
Experience working with data management solutions like Data Quality, Metadata, Master Data, and Governance.
Strong experience in cloud data migration programs.
Focused on value, innovation and automation-led account mining.
Strong interpersonal, stakeholder management and team building skills.
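The dimensional-modeling and star-schema concepts mentioned above reduce to a fact table joined to dimension tables. A hedged sketch of that idea, with invented names, using the standard-library sqlite3 module in place of a cloud warehouse such as BigQuery or Snowflake:

```python
# A tiny star schema: one fact table (sales) joined to one dimension
# (region) and aggregated — the basic query shape behind most BI reports.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (region_id INTEGER, amount REAL);
    INSERT INTO dim_region VALUES (1, 'North'), (2, 'South');
    INSERT INTO fact_sales VALUES (1, 100.0), (1, 25.0), (2, 40.0);
""")
rows = conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_region d USING (region_id)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('North', 125.0), ('South', 40.0)]
```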
Posted 1 month ago
7.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Sr. Software Engineer - Microsoft Power BI
Job Date: May 25, 2025
Job Requisition Id: 61407
Location: Bangalore, KA, IN / Hyderabad, TG, IN / Pune, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking to hire a Power BI Consultant who thrives on challenges and wants to make a real difference in the business world, in an environment of extraordinary innovation and unprecedented growth. The position is an exciting opportunity for a self-starter who enjoys working in a fast-paced, quality-oriented team environment.
Key Responsibilities:
Design and develop dashboards in Power BI.
Ensure successful data loads, report availability, and technical support.
Migrate existing dashboards to Power BI.
Perform data analysis using advanced analytics tools.
Apply excellent knowledge of data science and programming.
Act as an individual contributor for end-to-end report/dashboard development and data mining.
Create and maintain technical and functional documentation.
Qualifications:
Power BI expert with 7+ years of experience.
Excellent knowledge of BI tools such as Qlik and Power BI.
Strong hands-on experience designing Power BI dashboards.
Strong knowledge of DAX and SQL programming.
Strong knowledge of Python and data science tools.
Excellent spoken and written communication.
A minimum of a bachelor’s degree in IT, Data Sciences, or a related field.
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
Posted 1 month ago
12.0 years
0 Lacs
Greater Hyderabad Area
On-site
Technical Architect - AWS/Snowflake
Job Date: May 24, 2025
Job Requisition Id: 57720
Location: Hyderabad, IN; Bangalore, KA, IN
Posted by YASH Technologies.
We are looking to hire Snowflake professionals in the following areas:
Experience: 12+ years
Job Description:
Strong communication and proactive skills, with the ability to lead conversations.
Experience architecting and delivering solutions on AWS.
Hands-on experience with cloud warehouses such as Snowflake.
Strong knowledge of data integration, data modelling (Dimensional and Data Vault), and visualization practices.
Good knowledge of data management (data quality, data governance, etc.).
Eagerness to pick up new technologies, run PoCs, and present a point of view.
Technical (strong experience in at least one item in each category):
Cloud: AWS
Data integration: Qlik Replicate, SnapLogic, Matillion, Informatica
Visualization: Power BI, ThoughtSpot
Storage & databases: Snowflake, AWS
Good to have: certification in Snowflake or SnapLogic.
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What You'll Do
SQL Development & Optimization: Write complex and optimized SQL queries, including advanced joins, subqueries, analytical functions, and stored procedures, to extract, manipulate, and analyze large datasets.
Data Pipeline Management: Design, build, and support robust data pipelines to ensure timely and accurate data flow from various sources into our analytical platforms.
Statistical Data Analysis: Apply a strong foundation in statistical data analysis to uncover trends, patterns, and insights from data, contributing to data-driven decision-making.
Data Visualization: Work with various visualization tools (e.g., Google PLX, Tableau, Data Studio, Qlik Sense, Grafana, Splunk) to create compelling dashboards and reports that clearly communicate insights.
Web Development Contribution: Leverage your experience in web development (HTML, CSS, jQuery, Bootstrap) to support data presentation layers or internal tools.
Machine Learning Collaboration: Use your familiarity with ML tools and libraries (scikit-learn, Pandas, NumPy, Matplotlib, NLTK) to assist in data preparation and validation for machine learning initiatives.
Agile Collaboration: Work effectively within an Agile development environment, contributing to sprints and adapting to evolving requirements.
Troubleshooting & Problem-Solving: Apply strong analytical and troubleshooting skills to identify and resolve data-related issues.
Skills Required:
Expert in SQL (joins, subqueries, analytic functions, stored procedures).
Experience building and supporting data pipelines.
Strong foundation in statistical data analysis.
Knowledge of visualization tools: Google PLX, Tableau, Data Studio, Qlik Sense, Grafana, Splunk, etc.
Experience in web development: HTML, CSS, jQuery, Bootstrap.
Familiarity with ML tools: scikit-learn, Pandas, NumPy, Matplotlib, NLTK, and more.
Hands-on experience with Agile environments.
Strong analytical and troubleshooting skills.
Bachelor's in CS, Math, Stats, or equivalent. (ref:hirist.tech)
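As a rough illustration of the analytic-SQL skills this role calls for, here is a minimal, hypothetical sketch using Python's built-in sqlite3 module. The table, column names, and figures are all invented; window functions need SQLite 3.25+, which ships with recent Python builds.

```python
import sqlite3

# Hypothetical mini-dataset: daily sales per region (invented figures).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", 1, 100.0), ("north", 2, 150.0), ("north", 3, 120.0),
     ("south", 1, 80.0), ("south", 2, 90.0)],
)

# Analytic (window) function: a running total of sales within each region,
# ordered by day -- the kind of query this role's SQL requirement describes.
rows = conn.execute("""
    SELECT region, day, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM sales
    ORDER BY region, day
""").fetchall()

for region, day, amount, running in rows:
    print(region, day, amount, running)
```

The PARTITION BY clause restarts the running total for each region, so no self-join or subquery is needed.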
Posted 1 month ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
What We Do
At Goldman Sachs, our Engineers don’t just make things – we make things possible. Change the world by connecting people and capital with ideas. Solve the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low-latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning to continuously turn data into action. Engineering is at the critical center of our business, and our dynamic environment requires innovative strategic thinking and immediate, real solutions. Want to push the limit of digital possibilities? Start here.
Who We Look For
Goldman Sachs Engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile, and more. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.
Responsibilities & Qualifications
BTech/BE/MTech in Computer Science with a minimum of 6 years of experience.
Hands-on developer experience with an awareness of the skills below:
Design and development of web-based applications using Java/J2EE, REST, and relational and NoSQL databases.
Cloud technologies: AWS, Azure
Databases: DB2, Sybase IQ, MongoDB
Programming: Java, Python, shell script, Terraform
Messaging: Kafka, RMQ
Frameworks: Spring Boot, Spring Cloud
Site Reliability Engineering (SRE): Prometheus, Grafana
UI: ReactJS, visualization libraries
BI tools: Alteryx, Tableau, Qlik Sense, Power BI
Containers: Docker, Kubernetes
Preferred Qualifications
8+ years of industry experience with a focus on technical architecture, project management, and leadership in a fast-paced Agile environment.
Stakeholder management: experience working with business or clients to transform requirements, provide updates, and manage expectations.
Strong analytical and problem-solving skills.
Experience with continuous delivery and deployment practices; experience with Git pipelines preferred.
Advocate of strong engineering practices, expected to run and maintain a robust engineering plant with SRE and operational readiness.
Experience working with cloud infrastructure and SaaS solutions in a hybrid cloud environment.
Posted 1 month ago
5.0 years
0 Lacs
Delhi, India
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand which business questions can be answered and how to unlock the answers.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.
Key Responsibilities:
Lead the development, design, and implementation of QlikSense applications.
Integrate QlikSense with WhatsApp and Qlik Alerting for real-time data notifications and alerts.
Develop and implement predictive analytics models using QlikSense.
Create and maintain complex data models, dashboards, and visualizations.
Perform data extraction, transformation, and loading (ETL) from various data sources.
Collaborate with business users to gather requirements and translate them into technical solutions.
Optimize QlikSense applications for performance and usability.
Provide mentorship and guidance to junior developers.
Ensure data accuracy and integrity in all QlikSense applications.
Stay updated with the latest QlikSense features, predictive analytics techniques, and best practices.
Required Skills and Qualifications:
5+ years of experience in QlikSense development.
Strong knowledge of QlikSense scripting, data modeling, and dashboard development.
Experience with WhatsApp API integration.
Proficiency in Qlik NPrinting.
Proficiency in predictive analytics and machine learning techniques.
Advanced SQL skills and experience with relational databases (e.g., MS SQL Server, Oracle).
Experience with data integration and ETL processes.
Excellent problem-solving and analytical skills.
Strong communication skills to interact with business users and stakeholders.
Ability to work independently and as part of a team.
QlikSense certification is a plus.
Mandatory Skill Sets: QlikView
Preferred Skill Sets: QlikView
Years of Experience Required: 4-7
Education Qualification: BTech (Bachelor of Engineering)
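As a rough, hypothetical sketch of the extract-transform-load responsibility above (not PwC's or Qlik's actual tooling), the basic shape of an ETL step can be shown with Python's standard library; the source data, table, and column names are invented.

```python
import csv
import io
import sqlite3

# Extract: hypothetical raw CSV, as it might arrive from a source system.
raw = """customer,amount,currency
acme,1200.50,USD
globex,980.00,usd
initech,,USD
"""
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalise the currency code and drop rows with no amount.
clean = [
    {"customer": r["customer"],
     "amount": float(r["amount"]),
     "currency": r["currency"].upper()}
    for r in rows
    if r["amount"]
]

# Load: write the cleaned rows into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (customer TEXT, amount REAL, currency TEXT)")
conn.executemany(
    "INSERT INTO fact_sales VALUES (:customer, :amount, :currency)", clean
)
loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_sales").fetchone()
print(loaded)
```

Real pipelines add logging, rejected-row handling, and incremental loads, but the extract, transform, and load stages keep this same shape.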
Posted 1 month ago
7.0 years
2 - 8 Lacs
Gurgaon
On-site
As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact. How You Will Contribute: As Manager, Data Analytics and Automation – Internal Audit, you will lead the data analytics and automation function within the Internal Audit department at Ciena. You will be responsible for managing a team of data analysts and driving the strategic use of data to enhance audit effectiveness, risk assessment, and business process improvement. Additionally, you will partner closely with audit leadership and business stakeholders to embed advanced analytics, automation, artificial intelligence, continuous auditing and monitoring, and digital tools into all phases of the audit lifecycle, supporting the company’s mission of operational excellence and continuous innovation. Key Responsibilities: Lead, mentor, and develop a team of data analysts, fostering a culture of innovation, collaboration, and continuous learning, specifically building Artificial Intelligence (AI) and Machine Learning (ML) capabilities within the team. Develop and champion the strategic roadmap for integrating AI/ML into internal audit processes, including the identification of high-impact use cases for data analytics, continuous auditing and risk assessment. Oversee the design and execution of data analytics strategies, incorporating AI/ML techniques and continuous auditing approaches, that support audit planning, execution, and reporting. Collaborate with the internal audit team and business leaders to identify opportunities for data-driven insights, automation, and process improvements. 
Ensure the integrity, quality, and security of data used in audit analytics, maintaining compliance with company policies and regulatory requirements. Translate complex AI/ML findings, continuous auditing outputs, and data analytics observations into actionable insights and clear recommendations for audit leadership and business stakeholders, fostering data-driven decision-making. Develop and maintain advanced dashboards, visualizations, and analytical models to communicate key findings and trends to stakeholders at all levels. Drive the adoption of emerging technologies (AI/ML, automation) within the audit function, recommending and implementing innovative solutions. Manage multiple concurrent projects, allocating resources and setting priorities to deliver high-impact results on time. Stay current with industry trends, regulatory changes, and best practices in data analytics, automation, internal audit, risk management, and the application of AI in these fields. Support the integration of data analytics, automation, and AI into the audit methodology, ensuring each is embedded in risk assessment, control testing, and reporting. Build strong relationships with IT, Finance, and other business partners to facilitate access to data and alignment on analytics initiatives.
The Must Haves:
Education: Bachelor’s degree in Data Science, Computer Science, Information Systems, Business Analytics, or a related field. Master’s degree preferred.
Experience: 7+ years of progressive experience in data analytics and automation, with at least 2 years in a leadership or managerial role, preferably within internal audit, risk management, or a high-tech environment. Advanced proficiency with data analytics tools, including SQL (Snowflake experience preferred), Python, R, Alteryx, or similar tools, and data visualization platforms (e.g., Power BI, Tableau, Qlik).
Experience extracting and transforming data from large data lakes to derive actionable insights using advanced analytical techniques is essential. Strong understanding of ERP systems (Oracle, SAP), cloud platforms, and business process data flows. Demonstrated ability to lead teams, manage complex projects, and deliver data-driven insights that influence business decisions. In-depth knowledge of internal controls, audit methodologies, and risk management frameworks is a strong asset. Excellent communication, stakeholder management, and problem-solving skills. High ethical standards, attention to detail, and commitment to confidentiality and data security.
Assets: Experience working in a global, high-tech, or rapidly evolving business environment. Familiarity with regulatory requirements (SOX, GDPR, CCPA) and audit standards. Innovative mindset and passion for driving digital transformation within the audit function. Proven experience in designing and implementing automation and AI/ML solutions or continuous auditing programs within an internal audit, risk management, or compliance function.
This role is ideal for a dynamic leader who combines technical expertise with strategic vision, ready to elevate the impact of data analytics, AI, and automation in internal audit at a leading technology company.
At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.
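One minimal illustration of the kind of continuous-auditing analytic described above, assuming nothing about Ciena's actual stack, is a simple z-score outlier test over transaction amounts, sketched here with Python's standard library; all figures are invented.

```python
import statistics

def flag_outliers(amounts, z_threshold=2.0):
    """Flag amounts that deviate from the mean by more than z_threshold
    standard deviations -- a basic continuous-auditing exception test."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [a for a in amounts if abs(a - mean) / stdev > z_threshold]

# Hypothetical expense amounts; one entry sits far outside the normal range.
expenses = [120.0, 95.0, 110.0, 105.0, 98.0, 5000.0, 102.0, 115.0]
print(flag_outliers(expenses))  # the 5000.0 entry is flagged for review
```

In practice an audit team would run such a test on a schedule against live data and route the flagged exceptions into a review workflow, rather than printing them.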
Posted 1 month ago
0 years
10 - 10 Lacs
Bengaluru
On-site
Transport is at the core of modern society. Imagine using your expertise to shape sustainable transport and infrastructure solutions for the future. If you seek to make a difference on a global scale, working with next-gen technologies and the sharpest collaborative teams, then we could be a perfect match.
Introduction
Your new team belongs to Production Logistics, which is part of Group Trucks Operations (GTO). We are an organization of approximately 650 employees, globally connected to deliver logistics solutions with world-class operational excellence. We ensure that transportation is purchased, packaging is made available at our suppliers, material is transported to our production facilities, and vehicles are distributed to our customers on time. We design and optimize the Production Logistics supply chain for the Volvo Group, prepare logistics for new products, and drive the Sales & Operations Planning process. We strive for an innovative and diverse workplace, based upon the values of the Volvo Group, always with a high focus on customer success.
What you will do
Join our team to design and deliver digital solutions using analytics and automation tools that empower logistics and supply chain operations, all while driving Agile IT change management for transportation systems.
Who’s your new team?
We are a tight-knit group with the sole mission of delivering the best supply chain solutions to our Volvo Group North American production plants and distribution centers. With our can-do attitude and critical-thinking team members, we continue to push the envelope of supply chain design, automation, visualization, and lean thinking. We work with a suite of software to create a logistics “digital twin” of our transport system.
We handle everything from design (Oracle Transportation Management, SAP, and Volvo IT systems) to data collection (SQL, Azure, etc.) to data visualization (Power BI, Qlik, etc.). Our goal is to ensure a smooth flow of information between these systems and our stakeholders so that quality, low-cost, and environmentally conscious transport is secured. This position will work out of the Bengaluru, India location and will report to the Value Chain Development and Analytics Manager.
What’s your role in the team?
Do you dream big? We do too, and we are excited to grow together. You will be responsible for researching new technologies and methods across advanced data analytics and data visualization. Explore opportunities at the intersection of ML, AI, automation, and big data to drive innovation in logistics network design, visibility, and management. You will take ambiguous problems from our business and create integrated solutions using various structured and unstructured data (SQL, Azure DB, Volvo systems, etc.). The key objective is to proactively inform and manage change of the logistics network design using internal and external data. You will do this using automation (to retrieve and combine data sets) and advanced (predictive/prescriptive) analytics to find correlation or causality within millions of data entries and across several systems. You will collaborate with cross-functional teams to evaluate and implement change requests in IT systems. After deployment, you will lead regression testing efforts to verify that existing features continue to perform as expected and that no unintended issues arise. You will be the liaison for adoption of these new technologies by the team. As a change leader, you will find or create education material and coach all stakeholders impacted by newfound technologies.
Who are you?
We are looking for candidates with the following skills, knowledge, and experience:
Bachelor’s degree in Computer Science, Information Technology, Data Science, Industrial Engineering, or a related field.
Strong knowledge of relational databases (e.g., SQL Server, Azure SQL DB) and proficiency in writing complex SQL queries.
Proven experience with ETL (Extract, Transform, Load) processes for data integration and transformation across multiple systems.
Proficiency in data visualization tools (e.g., Qlik, Power BI) and Microsoft Power Apps products.
Hands-on skill and experience in a programming language used in data science applications (such as Python or R).
Experience working with IT system change management processes and regression testing.
Strong verbal and written communication skills, with the ability to clearly convey complex ideas to diverse stakeholders.
Demonstrated ability to operate effectively across time zones and collaborate with globally distributed teams.
Are you excited to bring your skills and disruptive ideas to the table? We can’t wait to hear from you. Apply today! We value your data privacy and therefore do not accept applications via mail.
Who we are and what we believe in
We are committed to shaping the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents across the group’s leading brands and entities. Applying to this job offers you the opportunity to join Volvo Group. Every day, you will be working with some of the sharpest and most creative brains in our field to be able to leave our society in better shape for the next generation. We are passionate about what we do, and we thrive on teamwork. We are almost 100,000 people united around the world by a culture of care, inclusiveness, and empowerment. Group Trucks Operations encompasses all production of the Group’s manufacturing of Volvo, Renault and Mack trucks, as well as engines and transmissions.
We also orchestrate the spare parts distribution for Volvo Group’s customers globally and design, operate and optimize logistics and supply chains for all brands. We count 30,000 employees at 30 plants and 50 distribution centers across the globe. Our global footprint offers an opportunity for an international career in a state-of-the-art industrial environment, where continuous improvement is the foundation. As our planet is facing great challenges, we - one of the largest industrial organizations in the world - stand at the forefront of innovation. We are ready to rise to the challenge. Would you like to join us?
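The posting above mentions using automation and analytics to find correlation across data sets. A minimal sketch of that idea, with invented figures and only Python's standard library (no claim about Volvo's actual systems or data):

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient, written out explicitly."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical weekly figures: shipment volume vs. transport cost.
volumes = [120, 135, 150, 160, 180, 200]
costs = [10.2, 11.0, 12.5, 13.1, 14.8, 16.0]

r = pearson_r(volumes, costs)
print(round(r, 3))  # close to 1.0: cost rises almost linearly with volume
```

A strong correlation like this is only a starting point; the role's "correlation or causality" framing implies following up with controlled comparisons before changing the network design.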
Posted 1 month ago