4.0 - 8.0 years
0 Lacs
karnataka
On-site
You are a highly skilled Python Developer with 4-8 years of experience, including visualization experience using tools such as Power BI, Qlik Sense, Python, ETL concepts, and SQL. In this role, you will be part of a dynamic team at KPMG in India, working closely with key stakeholders in both business and IT to translate data into compelling visual insights that drive business decision-making.

Your responsibilities will include creating complex Power BI reports end-to-end, collaborating with stakeholders to define data requirements, ensuring data security and compliance, communicating with key stakeholders to clarify requirements, and promoting UX designs and solutions. You will also support the design by analyzing user journeys, driving simplification and automation, and enabling high-quality storytelling through data visualizations. Additionally, you will document technical solution design and strategy, troubleshoot and optimize Power BI solutions, design, develop, and optimize Qlik Sense applications and dashboards, and mentor and coach junior developers; excellent communication and documentation skills are expected. An agile mindset is essential to work with Product Owners/Project Managers to break down complex requirements into MVP functionality and deliver enhancements every sprint.

Your required skills include 4-6 years of experience as a Power BI developer, with expertise in DAX, data modeling, ETL processes, implementing row-level security, Power BI Service, performance optimization, and visualization techniques. You should have proven ability in designing and implementing scalable data models, knowledge of data integration techniques and tools, experience with data pipeline orchestration and automation, proficiency in SQL and data warehouse concepts, and expertise in performance tuning and optimization of Power BI reports and SQL queries. You should also be familiar with Qlik Sense architecture, ETL, visualization techniques, data modeling best practices, Qlik expressions, custom extensions development, server administration, and NPrinting, with the ability to architect end-to-end BI solutions. Python development experience with exposure to frameworks such as Django and Flask, plus familiarity with advanced analytics, machine learning concepts, and Agile methodologies and practices, is also required.

A Bachelor's degree in Computer Science, Information Technology, or a related field is preferred, and relevant certifications will be considered a plus. KPMG in India is an equal employment opportunity provider.
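Since the posting pairs Python with ETL concepts and Power BI datasets, here is a minimal, hedged sketch of the kind of Python-side extract-transform-load step such a role might involve; the connection string, table, and column names are hypothetical, not taken from the posting.

```python
# A minimal ETL sketch feeding a curated table a BI tool could import.
# DSN, tables, and columns are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@host/warehouse")  # placeholder DSN

def build_sales_extract() -> pd.DataFrame:
    # Extract: pull raw transactions from the source system
    raw = pd.read_sql("SELECT order_id, region, amount, order_date FROM raw_sales", engine)
    # Transform: clean and aggregate to the grain the report needs
    raw["order_date"] = pd.to_datetime(raw["order_date"])
    monthly = (
        raw.dropna(subset=["amount"])
           .assign(month=lambda d: d["order_date"].dt.to_period("M").astype(str))
           .groupby(["region", "month"], as_index=False)["amount"]
           .sum()
    )
    # Load: write a curated table that the reporting layer imports
    monthly.to_sql("fct_monthly_sales", engine, if_exists="replace", index=False)
    return monthly
```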
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
You will be joining as an Oracle PL/SQL Developer with 2 to 4 years of experience, based in Hyderabad. Your primary responsibilities will include working with Oracle 11g/12c and leveraging advanced PL/SQL skills for packages, procedures, functions, triggers, and batch coding. You should also have a strong grasp of performance tuning techniques using SQL Trace, Explain Plan, indexing, and hints.

To excel in this role, you must hold a Bachelor's degree in Computer Science or a related field, along with at least 2 years of hands-on experience in Oracle PL/SQL development. A solid understanding of relational databases, proficiency in writing and optimizing PL/SQL code, and familiarity with database design and data modeling are essential. Additionally, you should be well-versed in database performance tuning and optimization strategies, database backup and recovery processes, and version control systems such as Git or SVN.

Knowledge of data integration and ETL processes, along with experience in Agile/Scrum environments, will be advantageous. Strong analytical and problem-solving skills are crucial, as is the ability to collaborate effectively in a team setting. Excellent verbal and written communication skills are highly valued. Certifications in Oracle technologies would be a plus, and familiarity with programming languages such as Java or Python is considered beneficial. If you meet these qualifications and have the required skills in database performance tuning, SQL querying, ETL processes, database design, and PL/SQL development, we encourage you to apply for this exciting opportunity.
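The tuning toolkit named above (Explain Plan, hints) can also be exercised from Python; the following is a hedged sketch using the python-oracledb driver, with connection details, the table, and the index name as illustrative placeholders.

```python
# Hedged sketch: generating and reading an execution plan with python-oracledb.
# Credentials, DSN, table, and index name are placeholders.
import oracledb

conn = oracledb.connect(user="scott", password="tiger", dsn="localhost/XEPDB1")
cur = conn.cursor()

# Ask the optimizer to record a plan (the INDEX hint names a hypothetical index)
cur.execute(
    "EXPLAIN PLAN FOR "
    "SELECT /*+ INDEX(e emp_dept_ix) */ * FROM emp e WHERE e.deptno = 10"
)

# DBMS_XPLAN renders the plan just written to PLAN_TABLE
for (line,) in cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
    print(line)
```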
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
The Lead, Software Engineer role at Mastercard plays a crucial part in the data unification process across different data assets to create a unified view of data from multiple sources. The position focuses on driving insights from available data sets and supporting the development of new data-driven cyber products, services, and actionable insights. The Lead, Software Engineer will collaborate with teams such as Product Management, Data Science, Platform Strategy, and Technology to understand data needs and requirements for delivering data solutions that bring business value.

Key responsibilities include performing data ingestion, aggregation, and processing to derive relevant insights; manipulating and analyzing complex data from various sources; and identifying innovative ideas, delivering proofs of concept and prototypes, and proposing new products and enhancements. The role also covers integrating and unifying new data assets to enhance customer value, analyzing transaction and product data to generate actionable recommendations for business growth, and collecting feedback from clients and the development, product, and sales teams for new solutions.

The ideal candidate has a good understanding of streaming technologies such as Kafka and Spark Streaming; proficiency in programming languages such as Java, Scala, or Python; experience with an enterprise business intelligence or data platform; strong SQL and higher-level programming skills; knowledge of data mining and machine learning algorithms; and familiarity with ETL/ELT tools such as Apache NiFi, Azure Data Factory, Pentaho, and Talend. They should also be able to work in a fast-paced, deadline-driven environment, collaborate effectively with cross-functional teams, and articulate solution requirements for different groups within the organization.

All employees working at or on behalf of Mastercard must adhere to the organization's security policies and practices, ensure the confidentiality and integrity of accessed information, report any suspected information security violations or breaches, and complete all mandatory security trainings in accordance with Mastercard's guidelines. The role offers an exciting opportunity to contribute to innovative data-driven solutions that drive business growth and enhance the customer value proposition.
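As one illustration of the streaming stack the posting names, here is a hedged PySpark sketch of Spark Structured Streaming reading from Kafka; the broker address, topic, and payload schema are placeholders.

```python
# Illustrative streaming-ingestion pattern: Spark Structured Streaming over Kafka.
# Broker, topic, and schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("txn-ingest").getOrCreate()

schema = StructType().add("merchant_id", StringType()).add("amount", DoubleType())

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "transactions")
         .load()
         # Kafka delivers bytes; decode and parse the JSON payload
         .select(from_json(col("value").cast("string"), schema).alias("txn"))
         .select("txn.*")
)

# Aggregate per merchant and write results to the console for demonstration
query = (
    events.groupBy("merchant_id").sum("amount")
          .writeStream.outputMode("complete").format("console").start()
)
query.awaitTermination()
```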
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
Genpact is a global professional services and solutions firm with a workforce of over 125,000 people across more than 30 countries. Our team is motivated by curiosity, agility, and a commitment to delivering enduring value to our clients. Our purpose, the relentless pursuit of a world that functions better for people, drives us to serve and transform leading enterprises worldwide, including Fortune Global 500 companies. Our expertise lies in deep business and industry knowledge, digital operations services, and proficiency in data, technology, and AI.

We are currently seeking applications for the position of Senior Manager - Solutions Designer, specializing in Banking and Capital Markets. In this role, you will collaborate closely with clients to understand their requirements for AI solutions and craft scalable AI architectures using advanced machine learning and deep learning models. You will develop comprehensive solution proposals, encompassing technical architectures and implementation strategies, and work alongside data scientists and engineers to prototype, build, and deploy AI models, ensuring their seamless integration with existing systems. You will also provide continuous technical guidance, conduct performance evaluations, and remain abreast of emerging AI technologies. The role further involves designing AI solutions that encompass OCR, RPA, UX/UI, decisioning engines, data integration, and visualization; supporting client governance processes; aiding in requirements gathering; contributing to business case development; and advocating for the capabilities and services offered by the AI/ML CoE.

In practice, your responsibilities will include collaborating with clients to understand their business objectives, challenges, and AI solution requirements; designing and architecting scalable AI solutions that use cutting-edge machine learning algorithms, deep learning models, and other AI techniques; and developing detailed solution proposals, including technical architectures, implementation plans, and project timelines. Working closely with data scientists and machine learning engineers, you will prototype, build, and deploy AI models and applications, lead the integration of AI solutions with existing systems to ensure seamless interoperability and scalability, and provide technical guidance and support throughout the project lifecycle, conducting regular assessments and optimizations to enhance the performance and efficacy of deployed AI solutions. Staying current on emerging AI technologies, trends, and best practices will be crucial to enhancing the company's capabilities and offerings. You will also design AI-based solutions that combine with other reusable capabilities and services such as OCR, RPA, UX/UI, decisioning and recommendation engines, data integration, and visualization; support client internal governance processes; participate in architectural, security, risk, and other reviews; assist in requirements gathering, business case creation, and stakeholder engagement; and collaborate with the team to promote the capabilities and services of the AI/ML CoE.

Qualifications we seek in you!
Minimum Qualifications:
- Bachelor's or Master's degree in computer science, engineering, mathematics, or a related field. Advanced degrees or certifications in artificial intelligence or machine learning are advantageous.
- Demonstrated experience in designing and delivering AI solutions, preferably within a consulting or professional services environment.
- Strong expertise in AI technologies, including machine learning frameworks such as TensorFlow and PyTorch, natural language processing (NLP), computer vision, and reinforcement learning.

Preferred Qualifications/Skills:
- Comprehensive understanding of software engineering principles, cloud computing platforms such as AWS, Azure, and Google Cloud, and DevOps practices.
- Outstanding analytical and problem-solving abilities, with the capacity to translate complex business requirements into practical AI solutions.
- Effective communication and presentation skills, with the capability to explain technical concepts to non-technical audiences.
- Proven ability to work efficiently in cross-functional teams and collaborate with diverse stakeholders.

Job Details:
- Job Title: Senior Manager
- Primary Location: India-Noida
- Schedule: Full-time
- Education Level: Master's or Equivalent
- Job Posting Date: October 4, 2024, 2:47:49 AM
- Unposting Date: November 3, 2024, 12:29:00 PM
- Master Skills List: Digital
- Job Category: Full Time
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Manager (MDG - Material Master) at Kenvue, a leading healthcare company focused on enhancing lives globally, you will oversee and optimize the Master Data Management (MDM) technology framework for the Material Master data domain. Your role will involve designing, implementing, and maintaining a robust MDM technology infrastructure to ensure data integrity, consistency, and accuracy across the organization. Collaboration with cross-functional teams will be essential to establish and enforce technical excellence, policies, standards, and security measures aligned with Kenvue's strategic objectives.

Your key responsibilities will include designing, developing, and implementing material/product master data management solutions using tools such as SAP MDG On-Premise. You will also develop and maintain data models, data mappings, and data integration workflows, and implement data quality rules to ensure accuracy and consistency. Collaborating with various teams to ensure data governance and regulatory compliance, providing guidance on MDM/SAP MDG best practices, and staying current on emerging trends in the MDM space, including generative AI, will be crucial aspects of your role.

Additionally, you will implement master data management policies, processes, standards, capabilities, and tools organization-wide, overseeing MDM tool and technology implementation for the governance of master data objects throughout the company. You will develop and deliver training programs on master data tools and technology to global process experts and end-users, manage a team of master data technologists, and influence senior stakeholders on the business value of master data for Kenvue.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field, with a Master's degree preferred. You should have at least 10 years of experience designing, developing, and implementing master data management solutions using MDM/SAP MDG tools and technologies. An understanding of generative AI in the master data context, experience in the Material Master domain within healthcare, and familiarity with MDM technologies such as SAP MDG, augmented MDM with machine learning, and workflow orchestration with SAP Fiori and SAP BTP are required. Strong analytical, problem-solving, and decision-making skills, excellent communication and interpersonal abilities, and the capacity to work independently and as part of a team are essential. You should also have experience working with high-performing teams, building relationships, and holding external service partners accountable. Exceptional relationship-building, influencing, and leadership skills in a complex matrixed environment will be key to your success.

Join Kenvue in contributing to our mission of improving global healthcare through effective MDM technology. If you meet the qualifications and possess the necessary skills, we encourage you to apply for this Manager (MDG - Material Master) role based in Bangalore, India.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
west bengal
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. We are counting on your unique voice and perspective to help EY become even better. Join us and build an exceptional experience for yourself, and a better working world for all.

We are seeking a highly skilled and motivated Data Analyst with experience in ETL services to join our dynamic team. As a Data Analyst, you will be responsible for data requirement gathering, preparing data requirement artefacts, data integration strategies, data quality, data cleansing, and optimizing data pipelines and solutions that support business intelligence, analytics, and large-scale data processing. You will collaborate closely with data engineering teams to ensure seamless data flow across our systems. The role requires hands-on experience in the Financial Services domain with solid data management, Python, SQL, and advanced SQL development skills. You should be able to interact with data stakeholders and source teams to gather data requirements; understand, analyze, and interpret large datasets; prepare data dictionaries, source-to-target mappings, and reporting requirements; and develop advanced programs for data extraction and analysis.

Key Responsibilities:
- Interact with data stakeholders and source teams to gather data requirements
- Understand, analyze, and interpret large datasets
- Prepare data dictionaries, source-to-target mappings, and reporting requirements
- Develop advanced programs for data extraction and preparation
- Discover, design, and develop analytical methods to support data processing
- Perform data profiling manually or using profiling tools (see the sketch below)
- Identify critical data elements and PII handling processes/mandates
- Collaborate with the technology team to develop analytical models and validate results
- Interface and communicate with onsite teams directly to understand requirements
- Provide technical solutions as per business needs and best practices

Required Skills and Qualifications:
- BE/BTech/MTech/MCA with 3-7 years of industry experience in data analysis and management
- Experience in finance data domains
- Strong Python programming and data analysis skills
- Strong advanced SQL/PL SQL programming experience
- In-depth experience in data management, data integration, ETL, data modeling, data mapping, data profiling, data quality, reporting, and testing

Good to have:
- Experience using Agile methodologies
- Experience using cloud technologies such as AWS or Azure
- Experience in Kafka, Apache Spark using SparkSQL and Spark Streaming, or Apache Storm

Other key capabilities:
- Client-facing skills and proven ability in effective planning, execution, and problem-solving
- Excellent communication, interpersonal, and teamworking skills
- Multi-tasking attitude, flexible, with the ability to change priorities quickly
- Methodical approach, logical thinking, and the ability to plan work and meet deadlines
- Accuracy and attention to detail
- Written and verbal communication skills
- Willingness to travel to meet client needs
- Ability to plan resource requirements from high-level specifications
- Ability to quickly understand and learn new technologies/features and inspire change within the team and client organization

EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across assurance, consulting, law, strategy, tax, and transactions. EY teams ask better questions to find new answers for the complex issues facing our world today.
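For the data-profiling responsibility above, a minimal pandas sketch of a manual profiling pass might look like the following; the source file and columns are hypothetical.

```python
# Minimal data-profiling sketch: completeness and basic stats per column.
# The source file and its columns are hypothetical placeholders.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "non_null": df.notna().sum(),
        "null_pct": (df.isna().mean() * 100).round(2),
        "distinct": df.nunique(),
    })

trades = pd.read_csv("trades.csv")           # placeholder source file
print(profile(trades))
print(trades.describe(include="all"))        # quick distribution summary
```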
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Database Developer and Designer, you will be responsible for building and maintaining Customer Data Platform (CDP) databases to ensure performance and stability. Your role will involve optimizing SQL queries to improve performance, creating visual data models, and administering database security. Troubleshooting and debugging SQL code issues will be a crucial part of your responsibilities.

You will be involved in data integration tasks, importing and exporting events, user profiles, and audience changes to Google BigQuery. Utilizing BigQuery for querying, reporting, and data visualization will be essential. Managing user and service account authorizations, as well as integrating Lytics with BigQuery and other data platforms, will also be part of your duties. Handling data export and import between Lytics and BigQuery, configuring authorizations for data access, and using data from various source systems to integrate with CDP data models are key aspects of the role.

Preferred candidates will have experience with the Lytics CDP and CDP certification. Hands-on experience with at least one Customer Data Platform technology and a solid understanding of the digital marketing ecosystem are required. Your skills should include proficiency in SQL and database management, strong analytical and problem-solving abilities, experience with data modeling and database design, and the capability to optimize and troubleshoot SQL queries. Expertise in Google BigQuery and data warehousing, knowledge of data integration and ETL processes, familiarity with Google Cloud Platform services, and a strong grasp of data security and access management are essential. You should also be proficient in Lytics and its integration capabilities, have experience with data import/export processes, know authorization methods and security practices, bring strong communication and project management skills, and be able to learn new CDP technologies and deliver in a fast-paced environment.

Ultimately, your role is crucial for efficient data management and for enabling informed decision-making through optimized database design and integration.
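As a hedged illustration of the BigQuery querying described above, the sketch below uses the official google-cloud-bigquery client; the project, dataset, and table names are placeholders, and the Lytics-side export configuration is not shown.

```python
# Hedged sketch: querying exported audience data in BigQuery.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # uses application default credentials

sql = """
    SELECT audience_id, COUNT(DISTINCT user_id) AS users
    FROM `my-project.cdp.audience_changes`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY audience_id
"""
for row in client.query(sql).result():
    print(row.audience_id, row.users)
```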
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Java Microservices Engineer at our Pune, India location, you will be an integral part of our agile development team. Your primary responsibility will involve proposing end-to-end technical solutions for complex business problems, creating solution design documents, and collaborating with solution architects and data architects. Additionally, you will play a key role in grooming and leading junior developers while ensuring the design, development, and QA of code-based assets within an agile delivery context.

In this role, you will need a strong background in high-performance, highly resilient microservice Java-based middle-tier development using the Spring Cloud framework. Your experience with server-side development, data processing, networks, protocols, and agile/continuous-integration/test technologies such as git/stash, Jenkins, Artifactory, Appium, Selenium, and SonarQube will be crucial. You should also be proficient in data modeling and SQL, in developing scalable applications using Kafka, and in working with API-based services.

Key Responsibilities:
- Proposing end-to-end technical solutions for complex business problems
- Creating solution design documents
- Collaborating with solution architects and data architects
- Grooming and leading junior developers
- Producing code-based assets within an agile delivery context
- Ensuring compliance with coding guidelines and standards
- Performing reviews of component integration testing, unit testing, and code
- Writing high-performance, highly resilient microservice Java-based middle-tier code
- Working with SOA (SOAP/REST/OData) and developing scalable applications using Kafka
- Applying a good understanding of relational databases, data models, and SQL
- Working in a fast-paced, high-energy team environment

Skills and Experience:
- Excellent communication and influencing skills
- Open-mindedness
- Ability to work in a fast-paced environment
- Passion for sharing knowledge and best practices
- Ability to work in virtual teams and matrixed organizations
- Project management and people management skills
- Fluent English (written/verbal)

Education/Certification:
- Bachelor's degree from an accredited college or university with a concentration in Science, Engineering, or an IT-related discipline (or equivalent)

We offer a supportive environment with training, development, coaching, and a culture of continuous learning to help you excel in your career. Join us at Deutsche Bank Group, where we strive for a culture of empowerment, responsibility, commercial thinking, initiative, and collaboration. For further information about our company and teams, please visit our website: [Deutsche Bank Company Website](https://www.db.com/company/company.htm). We welcome applications from all individuals and promote a positive, fair, and inclusive work environment.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Integration Consultant at SAP, you will play a crucial role in SQL code conversions and data integration projects, ensuring seamless data flow across systems. Your primary responsibility will involve hands-on work on Titan migration and/or SAP Sales Performance Management implementation, and you will be instrumental in the migration from Oracle platforms to the HANA platform. The role requires you to bring strong functional and business skills to managing data integration tasks. Certifications in related fields are preferred but not mandatory; we are looking for individuals who possess the necessary qualifications and a passion for contributing to a dynamic team environment.

Located in Hyderabad, you will join a highly collaborative and caring team at SAP, where we value learning and development. We recognize and appreciate individual contributions and provide a variety of benefit options for you to choose from. SAP is a purpose-driven and future-focused company with a commitment to personal development and a strong team ethic. With a focus on inclusion, health, and well-being, we ensure that every individual, regardless of background, feels valued and empowered to perform at their best. We believe in unleashing all talents and fostering a more equitable world.

As part of our commitment to equal opportunity, SAP is proud to be an affirmative action employer. We provide accessibility accommodations to applicants with physical and/or mental disabilities, ensuring a supportive and inclusive recruitment process. If you require assistance with your application, please reach out to our Recruiting Operations Team at Careers@sap.com. At SAP, we believe in the unique capabilities and qualities that each individual brings to our company. We invest in our employees, inspiring confidence and helping them realize their full potential. Together, we strive to create a better and more inclusive world.

Please note that successful candidates may undergo a background verification process with an external vendor as part of the onboarding procedure. Join us at SAP to bring out your best and contribute to our mission of helping the world run better.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a member of the Providence Cybersecurity (CYBR) team, you will play a crucial role in safeguarding all information pertaining to caregivers, affiliates, and confidential business data. Your responsibilities will include collaborating with Product Management to assess use cases, functional requirements, and technical specifications. You will conduct data discovery and analysis to identify crucial data from source systems for meeting business needs. Additionally, you will develop conceptual and logical data models to validate requirements, highlighting essential entities and relationships and documenting assumptions and risks.

Your role will also involve translating logical data models into physical data models, creating source-to-target mapping documentation, and defining transformation rules. Supporting engineering teams in implementing physical data models, applying transformation rules, and ensuring compliance with data governance, security frameworks, and encryption mechanisms in cloud environments will be a key part of your responsibilities. Furthermore, you will lead a team of data engineers in designing, developing, and implementing cloud-based data solutions using Azure Databricks and Azure native services.

The ideal candidate will hold a Bachelor's degree in a related field such as computer science, along with certifications in data engineering or cybersecurity, or equivalent experience. Experience working with large and complex data environments, expertise in data integration patterns and tools, and a solid understanding of cloud computing concepts and distributed computing principles are essential. Proficiency in Databricks, Azure Data Factory (ETL pipelines), and the medallion architecture, along with hands-on experience designing and implementing data solutions using Azure cloud services, is required. Strong skills in SQL, Python, Spark, data modelling techniques, dimensional modelling, and data warehousing concepts are crucial. Relevant certifications such as Microsoft Certified: Azure Solutions Architect Expert or Microsoft Certified: Azure Data Engineer Associate are highly desirable. Excellent problem-solving, analytical, leadership, and communication skills are essential for communicating technical concepts and strategies to stakeholders at all levels, and you should be able to lead cross-functional teams, drive consensus, and achieve project goals in a dynamic, fast-paced environment.
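A minimal sketch of the bronze-to-silver step in the medallion architecture mentioned above, written in PySpark for Databricks; the paths, table names, and quality rules are illustrative assumptions.

```python
# Illustrative bronze-to-silver step in a medallion layout.
# Paths and columns are hypothetical; in practice these might be
# Unity Catalog tables or ADLS paths.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.format("delta").load("/mnt/lake/bronze/claims")

silver = (
    bronze.dropDuplicates(["claim_id"])                      # de-dupe the raw feed
          .filter(col("claim_amount") > 0)                   # basic quality rule
          .withColumn("service_date", to_date(col("service_date")))
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/claims")
```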
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Data Engineer, you will be responsible for designing, developing, and implementing data pipelines using StreamSets Data Collector. Your role involves ingesting, transforming, and delivering data from diverse sources to target systems. You will write and maintain efficient, reusable pipelines while adhering to coding standards and best practices. Additionally, developing custom processors and stages within StreamSets to address unique data integration challenges is a key aspect of your responsibilities. Ensuring data accuracy and consistency is crucial, and you will implement data validation and quality checks within StreamSets pipelines. Optimizing pipeline performance for high-volume data processing and automating deployment and monitoring using CI/CD tools are essential tasks you will perform.

In terms of quality assurance and testing, you will develop comprehensive test plans and test cases to validate pipeline functionality and data integrity. You will conduct thorough testing, debugging, and troubleshooting of pipelines to identify and resolve issues, standardize quality assurance procedures for StreamSets development, and perform performance testing and tuning to ensure optimal pipeline performance.

When it comes to problem-solving and support, you will research and analyze complex software-related issues to provide effective solutions. Timely resolution of production issues related to StreamSets pipelines is part of your responsibility, as is providing technical support and guidance to team members on StreamSets development and monitoring pipeline logs and metrics to identify and resolve issues.

Strategic alignment and collaboration are also essential: you will understand and align with departmental, segment, and organizational strategies and objectives, collaborate with data engineers, data analysts, and stakeholders to deliver effective data solutions, document pipeline designs and configurations, participate in code reviews, and contribute to the development of data integration best practices and standards.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, and a minimum of 3-5 years of hands-on experience in systems analysis or application programming development with a focus on data integration. Proven experience developing and deploying StreamSets Data Collector pipelines, a strong understanding of data integration concepts and best practices, proficiency in SQL, experience with relational databases, various data formats (JSON, XML, CSV, Avro, Parquet), cloud platforms (AWS, Azure, GCP) and cloud-based data services, and experience with version control systems (Git) are essential qualifications. Strong analytical and problem-solving skills, excellent communication and collaboration abilities, and the capacity to work independently are also necessary.
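To make the validation responsibility concrete, here is the kind of record-level check such pipelines apply, sketched in plain Python; this is illustrative logic only, not the StreamSets SDK or its scripting-stage API, and the field names are hypothetical.

```python
# Illustrative record-level validation of the kind embedded in pipeline stages.
# Plain Python, not StreamSets-specific; field names are hypothetical.
def validate(record: dict) -> list[str]:
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    if record.get("amount") is not None and record["amount"] < 0:
        errors.append("negative amount")
    return errors

good, bad = [], []
for rec in [{"customer_id": "C1", "amount": 10.0}, {"customer_id": "", "amount": -5}]:
    # Route invalid records to an error path, as a pipeline would
    (bad if validate(rec) else good).append(rec)

print(f"{len(good)} valid, {len(bad)} routed to error handling")
```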
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
This role is part of ADS's Global Technical Organization (GTO). GTO delivers world-class analytics and digital solutions across Ecolab's Product Quality, Process Safety & Sustainability landscape. The team is responsible for maintaining the highest quality standards by tracking performance metrics, identifying quality issues, and driving continuous improvement.

Are you a driven individual with a passion for data and analytics? We're looking to hire an Analyst (GTO) to join our Global Supply Chain Analytics team in Pune, India. In this role, you'll be responsible for leading initiatives in Digital Twin and Enterprise Quality KPIs, and for providing support in functional product deployments and maintenance. You'll work closely with quality functional stakeholders, data engineering, and design teams to co-create quality products and solutions and get them design-reviewed before formal product launch. You'll also bring functional thought leadership and industry best practices to experimentation and deployments.

Data Collection & Integration:
- Gather and integrate data from manufacturing systems, supply chain platforms, and IoT devices using OSI PI.
- Ensure real-time data availability, reliability, and accuracy to support operational needs.
- Configure and maintain data points, historians, and visualization tools.

Data Analysis and Visualization:
- Utilize Seeq for advanced data analytics and trend analysis to identify patterns and root causes of inefficiencies, as well as analysis of various quality indicators.
- Create dashboards, reports, and KPIs using Power BI and Redzone to provide actionable insights to stakeholders.
- Work closely with process engineers and plant managers to identify and manage process bottlenecks and drive operational efficiency.

Product Development and Deployment:
- Work on plant-level process efficiency improvement, automation and controls, and Digital Twin projects such as Aveva and Redzone, among others.
- Leverage product quality methodologies such as capable/non-capable specs, spec deviations, and supplier quality audits and assessments for overall quality KPIs, root cause analysis (RCA), CAPA, and Supplier Corrective Actions (SCAR).

Functional Knowledge Management:
- Contribute to knowledge management and continuous improvement initiatives.
- Partner with cross-functional teams to ensure successful project delivery.
- Stay updated on the latest developments in supply chain technologies and provide recommendations for system upgrades or integrations.
- Act as a subject matter expert on OSI PI, Seeq, and Redzone, providing training and support as needed.

What you bring:
- Bachelor's degree in chemical or mechanical engineering, computer science, information systems, or a related field (advanced degree a plus).
- 2+ years of experience in quality foundations and philosophies such as 8D, problem-solving, statistical process control (SPC), RCA, process capability, value stream mapping, and Six Sigma principles (see the capability sketch below).
- Good to have: Six Sigma Green Belt certification, data historians such as OSI PI, and real-time analytics tools such as Seeq and Redzone.
- Knowledge of Digital Twins and LIMS.
- Understanding of manufacturing key performance indicators.
- Excellent analytical and quality problem-solving skills with a keen eye for detail.
- Entry-level exposure to visualization with MS Power BI and familiarity with programming in Python and the M language.
- Good to have: exposure to cloud platforms (Azure/AWS) and data management with SQL querying, SAP tables/transactions, and Snowflake.
- Strong communication and collaboration skills.
- Ability to thrive in a fast-paced environment and manage multiple projects.

What's in it for you:
- The opportunity to join a growth company offering competitive compensation and benefits.
- The ability to make an impact and shape your career with a company passionate about grooming talent and creating future leaders.
- The support of an organization that believes it is vital to include and engage diverse people, perspectives, and ideas to achieve our best.
- The pride of working daily for a company that provides clean water, safe food, abundant energy, and healthy environments.

Our Commitment to Diversity and Inclusion
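For the process-capability skill mentioned in the list above, here is a small worked sketch of the standard Cpk calculation; the spec limits and sample measurements are made up for illustration.

```python
# Worked sketch of the process-capability (Cpk) check referenced above.
# Spec limits and sample data are illustrative only.
import statistics

measurements = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1]
LSL, USL = 9.5, 10.5   # lower/upper specification limits

mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)

# Cpk = min(USL - mean, mean - LSL) / (3 * sigma)
cpk = min(USL - mean, mean - LSL) / (3 * sigma)
print(f"mean={mean:.3f} sigma={sigma:.3f} Cpk={cpk:.2f}")  # >= 1.33 is a common target
```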
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
We are looking for an EDMCS Specialist with 7-8 years of experience, to be based in Bangalore with a remote work option. As an EDMCS Specialist, you will be responsible for leading end-to-end delivery of Enterprise Data Management Cloud Services (EDMCS) projects. Your role will involve translating business ideas into technical solutions, independently managing projects, and collaborating with cross-functional teams as needed. Your expertise in EDMCS configuration, solution design, and business analysis will be crucial in delivering high-quality results.

Key Responsibilities:
- Lead EDMCS projects from requirement gathering to implementation and delivery.
- Translate business requirements into technical solutions and implement them efficiently.
- Independently manage EDMCS projects while collaborating with cross-functional teams.
- Analyze business and technical requirements to develop comprehensive EDMCS solutions.
- Configure metadata, mappings, applications, and data integration within EDMCS.
- Provide innovative solutions to enhance existing architecture and design.
- Develop and configure EDMCS extracts for General Ledger and Consolidation applications.
- Ensure data accuracy and compliance with reconciliation requirements.

Required Skills and Qualifications:
- 7-8 years of experience in EDMCS or related fields, with a proven track record in end-to-end project delivery.
- Hands-on experience implementing full-life-cycle EDMCS projects.
- Strong knowledge of metadata configuration, mapping setup, and data integration.
- Expertise in developing EDMCS extracts for GL and Consolidation applications.
- Solid understanding of transaction matching, reconciliation compliance, and data accuracy.
- Strong problem-solving skills and the ability to provide innovative solutions.
- Excellent verbal and written communication skills for effective collaboration.

Preferred Qualifications:
- Experience with Oracle Cloud or other cloud technologies.
- Previous experience working in cross-functional teams in dynamic environments.
- Familiarity with Agile or other iterative project management methodologies.

This is a full-time position with the benefit of working from home. The ideal candidate will have a total of 8 years of work experience.
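EDMCS also exposes a REST interface for extracts; the sketch below shows the general shape of such a call with the requests library, but the endpoint path, payload fields, and auth scheme here are hypothetical placeholders rather than the documented EDMCS API, so consult the Oracle EPM REST reference for the real routes.

```python
# Hedged sketch of pulling a dimension extract over REST.
# The base URL, endpoint path, and response shape are hypothetical
# placeholders, not the documented EDMCS API.
import requests

BASE = "https://epm.example.com/epm/rest/v1"    # hypothetical base URL
auth = ("service_user", "secret")               # basic-auth placeholder

resp = requests.get(f"{BASE}/dimensions/Account/extract", auth=auth, timeout=30)
resp.raise_for_status()

for node in resp.json().get("items", []):
    print(node.get("name"), node.get("parent"))
```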
Posted 1 week ago
8.0 - 13.0 years
13 - 17 Lacs
Noida, Pune, Bengaluru
Work from Office
Position Summary:
We are looking for a highly skilled and experienced Data Engineering Manager to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client handling/stakeholder management skills. This role requires a strategic thinker who can drive the design, development, and implementation of data solutions that meet our clients' needs while ensuring the highest standards of quality and efficiency.

Job Responsibilities:
- Technology leadership: Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud-based data engineering / data warehousing project assignments.
- Solution architecture and review: Expertise in conceptualizing solution architecture and low-level design across a range of data engineering (Matillion, Informatica, Talend, Python, dbt, Airflow, Apache Spark, Databricks, Redshift) and cloud hosting (AWS, Azure) technologies.
- Manage projects in a fast-paced agile ecosystem, ensuring quality deliverables within stringent timelines.
- Own risk management, maintaining the risk documentation and mitigation plans.
- Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation, and deployments.
- Communication and logical thinking: Demonstrate strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment; effectively present and defend team viewpoints while securing buy-in from both technical and client stakeholders.
- Client relationship: Manage client relationships and expectations independently and deliver results back to the client independently, with excellent communication skills.

Education:
BE/B.Tech or Master of Computer Application.

Work Experience:
- Expertise and 8+ years of working experience in at least two ETL tools among Matillion, dbt, PySpark, Informatica, and Talend.
- Expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, and Oracle.
- Strong data warehousing, data integration, and data modeling fundamentals such as star schemas, snowflake schemas, dimension tables, and fact tables (a star-schema query sketch follows below).
- Strong experience with SQL building blocks, creating complex SQL queries and procedures.
- Experience with AWS or Azure cloud and their service offerings.
- Awareness of techniques such as data modelling, performance tuning, and regression testing.
- Willingness to learn and take ownership of tasks.
- Excellent written/verbal communication and problem-solving skills.
- Understanding of and working experience with pharma commercial data sets such as IQVIA, Veeva, Symphony, Liquid Hub, and Cegedim would be an advantage.
- Hands-on experience with scrum methodology (sprint planning, execution, and retrospectives).

Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management.

Technical Competencies: Problem Solving, Life Science Knowledge, Communication, Designing Technical Architecture, Agile, PySpark, AWS Data Pipeline, Data Modelling, Matillion, Databricks.

Location: Noida, Bengaluru, Pune, Hyderabad, India
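To ground the star-schema fundamentals listed above, here is a minimal PySpark sketch joining a fact table to a dimension on a surrogate key and aggregating; the table and column names are hypothetical.

```python
# Minimal star-schema query sketch: fact-to-dimension join plus aggregation.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

fact_sales = spark.table("fct_sales")        # grain: one row per sale
dim_product = spark.table("dim_product")     # conformed product dimension

result = (
    fact_sales.join(dim_product, "product_key")          # surrogate-key join
              .groupBy("brand")
              .agg(F.sum("net_sales").alias("net_sales"))
              .orderBy(F.desc("net_sales"))
)
result.show()
```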
Posted 1 week ago
3.0 - 7.0 years
35 - 100 Lacs
Bengaluru
Work from Office
QA Analyst - ETL and API
Req number: R5774
Employment type: Full time
Worksite flexibility: Remote

Who we are:
CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right—whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.

Job Summary:
We are seeking a skilled manual tester specializing in ETL and API testing to join our team. The ideal candidate will have a strong foundation in SQL, data migration, and data integration testing, with a keen eye for detail and a methodical approach to validating complex systems. If you are looking for your next career move, apply now.

Job Description:
We are looking for a QA Analyst (ETL and API) to perform data migration and data integration testing, validating ETL processes, data quality, and transformation rules. This position will be full time and remote.

What You'll Do:
- Utilize strong hands-on experience in SQL to validate data across multiple layers and systems (OLTP to OLAP).
- Perform data migration and data integration testing, validating ETL processes, data quality, and transformation rules.
- Conduct API testing using Postman to validate REST APIs, request/response structures, and business logic (see the sketch below).
- Automate testing in Postman using JavaScript, creating test scripts for dynamic validations and workflow chaining.
- Apply exposure to performance testing using JMeter, creating and executing load/stress test plans and analyzing performance metrics.
- Employ working experience with the RestSharp and REST Assured libraries for API automation testing.
- Develop and execute test automation frameworks using NUnit.
- Identify test scenarios, create test cases, and execute tests with a focus on data-driven validation.
- Work with CI/CD pipelines and integrate automated tests into the deployment workflow.
- Demonstrate strong debugging and analytical skills to investigate test failures and data mismatches.
- Exhibit excellent communication and documentation skills for reporting test results and collaborating with cross-functional teams.

What You'll Need:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: 3 - 5 years.
- Proven experience in manual testing with a focus on APIs.
- Proficiency in SQL, Postman, and JavaScript for test automation.
- Experience with performance testing tools like JMeter.
- Familiarity with API automation testing libraries such as RestSharp and REST Assured.
- Knowledge of NUnit for test automation framework development.
- Experience with CI/CD pipelines.

Physical Demands:
- Ability to safely and successfully perform the essential job functions.
- Sedentary work that involves sitting or remaining stationary most of the time, with occasional need to move around the office to attend meetings, etc.
- Ability to conduct repetitive tasks on a computer, utilizing a mouse, keyboard, and monitor.

Reasonable accommodation statement:
If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824-8111.
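The posting's API checks live in Postman and JavaScript; as a hedged illustration, the pytest sketch below shows the equivalent request/response validation in Python, with the base URL and response fields as placeholders.

```python
# Equivalent request/response validation in Python (pytest + requests).
# The base URL, endpoint, and field names are hypothetical placeholders.
import requests

BASE = "https://api.example.com"

def test_get_order_returns_expected_shape():
    resp = requests.get(f"{BASE}/orders/42", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Validate structure and business logic, as described above
    assert {"id", "status", "total"} <= body.keys()
    assert body["total"] >= 0
```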
Posted 1 week ago
3.0 - 8.0 years
5 - 10 Lacs
Noida
Work from Office
Are you passionate about data architecture and solving real-world problems with modern Big Data technologies?

About The Role:
We are seeking a highly skilled Resident Solution Architect to join our team. This professional will work closely with clients to understand their business needs, architect solutions, and implement modern Big Data platform technologies.

Key Responsibilities:
- Collaborate with customers to understand their business goals, data architecture, and technical requirements.
- Design end-to-end solutions that leverage Starburst products to address customer needs, including data access, analytics, and performance optimization.
- Develop architectural diagrams, technical specifications, and implementation plans for customer projects.
- Lead the implementation and deployment of Starburst solutions, working closely with customer teams and internal stakeholders.
- Provide technical guidance and best practices to customers on using Starburst products effectively.
- Work with partners to train and upskill external personnel on Starburst products.
- Collaborate with internal teams and external partners to create resources, best practices, and delivery processes.
- Troubleshoot and resolve technical issues during implementation and operation.
- Stay current on industry trends, emerging technologies, and data management best practices.

Required Experience & Knowledge:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Professional experience in technical roles such as solution architect, data engineer, or software engineer.
- Strong understanding of data architecture principles, including data modeling, data integration, and data warehousing.
- Proficiency in SQL and experience with distributed query engines (e.g., Presto, Trino, Apache Spark); a connectivity sketch follows below.
- Strong problem-solving skills and strategic thinking for technical and business challenges.
- Excellent communication and interpersonal skills.
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and containerization (e.g., Docker, Kubernetes) is a plus.
- Experience with open-source technologies and/or contributions to the open-source community is a plus.
- Experience delivering technical training, internally or externally, is a plus.
- Fluent English is required for daily communication with international clients and teams.

Other Information:
- This is a client-facing, hands-on role with high impact on customer success.
- Remote work opportunity.
- Clients are typically based in the USA, and work must be delivered within the customer's time zone.

Why Join Us?
- Work on impactful, client-facing projects with cutting-edge data technologies.
- Collaborate with a global, experienced team of professionals.
- Flexible remote work setup and the opportunity to grow within a fast-paced environment.

Ready to join us? Apply now and bring your data expertise to the next level!
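As a small illustration of working with a distributed query engine such as Trino/Starburst, here is a hedged connectivity sketch using the trino Python client; the host, credentials, and catalogs are placeholders, and the TPCH catalog is assumed to be enabled on the cluster.

```python
# Hedged connectivity sketch with the trino Python client.
# Host, user, and catalog names are placeholders; the built-in TPCH
# catalog is assumed to be configured on the cluster.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.com",
    port=8080,
    user="architect",
    catalog="hive",
    schema="default",
)
cur = conn.cursor()
cur.execute("SELECT nationkey, name FROM tpch.tiny.nation LIMIT 5")
for row in cur.fetchall():
    print(row)
```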
Posted 1 week ago
3.0 - 6.0 years
5 - 8 Lacs
Hyderabad
Work from Office
- Proficiency in JavaScript or TypeScript.
- Strong knowledge of React and React Native.
- Experience with mobile app development principles and patterns.
- Experience with UI/UX design principles and their implementation.
- Understanding of RESTful APIs and data integration.
- Experience with state management libraries like Redux.
- Experience with navigation libraries like React Navigation.
- Problem-solving and debugging skills.
- Version control using Bitbucket.
- Continuous learning and adaptation to new technologies and best practices.
Posted 1 week ago
2.0 - 4.0 years
1 - 4 Lacs
Bengaluru
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally, and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by over circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

Market Data Integration Support - Techno-Functional Specialist
Location: Bengaluru
Experience: 2 to 4 years
Designation: Associate
Industry/Domain: ETL/mapping tools, VBA, SQL, market data, capital markets knowledge

Apex Group Ltd has a requirement for a Market Data Integration Specialist. We are seeking an inquisitive and analytical thinker who will be responsible for ensuring the quality, accuracy, and consistency of pricing and reference data with recommended data providers in the financial domain, such as Bloomberg, Refinitiv, and Markit. The role is responsible for developing approaches, logic, methodology, and business requirements for validating, normalizing, integrating, transforming, and distributing data using data platforms and analytics tools. The candidate will be responsible for maintaining the integrity of organisationally critical data and supporting data-driven decision-making, and will be a data professional with a technical and commercial mindset, as well as an excellent communicator with strong stakeholder management skills.

Work Environment:
- Highly motivated, collaborative, and results-driven.
- Growing business within a dynamic and evolving industry.
- Entrepreneurial approach to everything we do.
- Continual focus on process improvement and automation.

Technical/Functional Expertise Required:
- Develop an understanding of reference and master data sets, vendor data (Bloomberg, Refinitiv, Markit), and the underlying data architecture, processes, methodology, and systems.
- Strong knowledge of market data provider applications (Bloomberg, Refinitiv, etc.).
- Develop automated frameworks to produce source and target mappings, data load and extraction processes, data pre-processing, transformation, integration from various sources, and data distribution.
- Work with the business to analyse and understand business requirements and review/produce technical and business specifications with a focus on reference data modelling.
- Integrate business requirements into logical solutions through qualitative and quantitative data analysis and prototyping.
- Strong knowledge of overall pricing and static data concepts such as investment types, pricing types, vendor hierarchy, price methodology, and market value concepts (a validation sketch follows below).
- Analyse complex production issues and provide solutions.
- Produce detailed functional and technical specification documents for development and testing.
- Hands-on experience working with any ETL tool is mandatory.
- Strong command of SQL, VBA, and advanced Excel.
- Understanding of the funds administration industry is necessary.
- Intermediate knowledge of financial instruments, both listed and unlisted or OTC, including but not limited to derivatives, illiquid stocks, private equity, bank debts, and swaps.
- Testing and troubleshooting integrations and technical configurations.
- Effectively multi-task, schedule, and prioritize deliverables to meet project timelines.
- Ensure operational guidelines are updated and adhere to standards and procedures; identify plans to mitigate risks wherever there is a control issue.
- Ability to contribute towards critical projects for product enhancements and efficiency gains.
- Good understanding of Geneva, Paxus, or any other accounting system.
- Self-starter with a quick learning ability, strong verbal and written communication skills, and the ability to present effectively.
- Maintenance and creation of standard operating procedures.
- Proficiency in an accounting system, preferably Advent Geneva or Paxus, would be an added advantage.
- An ability to work under pressure with changing priorities.

Experience and Knowledge:
- 3+ years of related support/technical experience in any accounting platform (Paxus/Geneva).
- Connect with operations to understand and resolve their issues.
- Experience working with data vendors (Bloomberg/Refinitiv/Markit).
- Able to handle reporting issues and new requirements raised by operations.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Strong Excel and Excel-function knowledge for business support.
- Create and maintain business documentation, including user manuals and guides.
- Worked on system upgrades/migrations/integrations.

Other Skills:
- Good team player, with the ability to work on a local, regional, and global basis.
- Excellent communication and management skills.
- Good understanding of financial services/capital markets/fund administration.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and, where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
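As an illustration of the pricing-validation work described above, the pandas sketch below compares two vendor feeds and flags breaks outside a tolerance; the data, the vendor column suffixes, and the threshold are made up.

```python
# Illustrative price-validation pass: compare two vendor feeds and flag
# breaks beyond a tolerance. Data, suffixes, and threshold are made up.
import pandas as pd

bbg = pd.DataFrame({"isin": ["X1", "X2"], "price": [101.25, 99.80]})
refinitiv = pd.DataFrame({"isin": ["X1", "X2"], "price": [101.30, 97.10]})

merged = bbg.merge(refinitiv, on="isin", suffixes=("_bbg", "_ref"))
merged["diff_pct"] = ((merged["price_bbg"] - merged["price_ref"]).abs()
                      / merged["price_ref"] * 100)

TOLERANCE = 1.0  # percent; illustrative threshold
breaks = merged[merged["diff_pct"] > TOLERANCE]
print(breaks)  # exceptions routed to the pricing team for review
```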
Posted 1 week ago
3.0 - 6.0 years
1 - 4 Lacs
Bengaluru
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the worlds largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by over circa 13,000 employees across 112 offices worldwide.Your career with us should reflect your energy and passion. Thats why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and well give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you Middle Office - Analyst - Business Systems - Location: Pune Experience: 3 - 6 years Designation: Associate Industry/Domain: ETL/Mapping Tool, VBA, SQL, Capital Market knowledge, Bank Debts, Solvas Apex Group Ltd has an immediate requirement for Middle Office Tech Specialist. As an ETL Techno-Functional Support Specialist at Solvas, you will be the bridge between technical ETL processes and end-users, ensuring the effective functioning and support of data integration solutions. Your role involves addressing user queries, providing technical support for ETL-related issues, and collaborating with both technical and non-technical teams to ensure a seamless data integration environment. You will contribute to the development, maintenance, and enhancement of ETL processes for solvas application, ensuring they align with business requirements. Work Environment: Highly motivated, collaborative, and results driven. Growing business within a dynamic and evolving industry. Entrepreneurial approach to everything we do. Continual focus on process improvement and automation. Functional/ Business Expertise Required Serve as the primary point of contact for end-users seeking technical assistance related to Solvas applications. Serve as a point of contact for end-users, addressing queries related to ETL processes, data transformations, and data loads. Provide clear and concise explanations to non-technical users regarding ETL functionalities and troubleshoot issues. Integrate Client Trade files into the Conversant systemdesign, develop, implement, and test technical solutions based on client and business requirements. Diagnose and troubleshoot ETL-related issues reported by end-users or identified through monitoring systems. Work closely with business analysts and end-users to understand and document ETL requirements. Monitor ETL jobs and processes to ensure optimal performance and identify potential issues. Create user documentation and guides to facilitate self-service issue resolution. Hands on experience in working on any ETL tools is mandatory. Strong command of SQL, VBA and Advance Excel. Good understanding of Solvas or any other loan operation system. Mandatory to have good knowledge of Solvas Bank Debt working. Intermediate knowledge of financial instruments, both listed and unlisted or OTCs, which includes and not limited to derivatives, illiquid stocks, private equity, bankdebts, and swaps. Understanding of the Loan operation industry is necessary. Should have knowledge of market data provider applications (Bloomberg, Refinitiv etc.). 
- Proficiency in any loan operation system, preferably Solvas.
- An ability to work under pressure with changing priorities.
- Strong analytical and problem-solving skills.

Experience and Knowledge:
- 3+ years of related support/technical experience in any loan operation system and accounting system (Solvas/Geneva).
- Connect with operations to understand and resolve their issues.
- Experience working with data vendors (Bloomberg/Refinitiv/Markit).
- Able to handle reporting issues and new requirements raised by operations.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Strong Excel and Excel functions knowledge for business support.
- Create and maintain business documentation, including user manuals and guides.
- Experience with system upgrades/migrations/integrations.

Other Skills:
- Good team player; ability to work on a local, regional, and global basis.
- Good communication and management skills.
- Good understanding of Financial Services/Capital Markets/Fund Administration.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
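The client trade-file integration described above lends itself to a small illustration. The following Python sketch shows one way an inbound trade file might be validated before it is staged for load; the file layout, field names, and rules are hypothetical, not Solvas or Conversant specifics.

```python
import csv
from decimal import Decimal, InvalidOperation

# Hypothetical layout for an inbound client trade file.
REQUIRED_FIELDS = ["trade_id", "instrument", "notional", "trade_date"]

def validate_trade_file(path: str) -> list[str]:
    """Return a list of validation errors; an empty list means the file can be staged."""
    errors = []
    with open(path, newline="") as fh:
        reader = csv.DictReader(fh)
        missing = [f for f in REQUIRED_FIELDS if f not in (reader.fieldnames or [])]
        if missing:
            return [f"missing columns: {', '.join(missing)}"]
        for line_no, row in enumerate(reader, start=2):  # header is line 1
            if not row["trade_id"].strip():
                errors.append(f"row {line_no}: empty trade_id")
            try:
                Decimal(row["notional"])
            except InvalidOperation:
                errors.append(f"row {line_no}: non-numeric notional {row['notional']!r}")
    return errors
```

In practice checks like these would run ahead of the ETL mapping itself, so operations receive a readable error report rather than a failed load.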
Posted 1 week ago
5.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Design, develop, and maintain ETL processes using Pentaho Data Integration (Kettle). Extract data from various sources including databases, flat files, APIs, and cloud platforms. Transform and cleanse data to meet business and technical requirements. Load data into data warehouses, data lakes, or other target systems. Monitor and optimize ETL performance and troubleshoot issues. Collaborate with data architects, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security throughout the ETL lifecycle. Document ETL processes, data flows, and technical specifications.

Grade Specific: Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
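Pentaho Data Integration jobs are normally assembled visually in Spoon rather than hand-coded, but the extract-transform-load pattern this role describes can be sketched at the language level. A minimal Python illustration using an in-memory SQLite source and target (all table and column names are invented):

```python
import sqlite3

def run_etl(src: sqlite3.Connection, tgt: sqlite3.Connection) -> None:
    # Extract: pull raw rows from the source system.
    rows = src.execute("SELECT id, amount, country FROM raw_orders").fetchall()
    # Transform: default nulls, round amounts, normalise country codes.
    cleaned = [(i, round(a or 0.0, 2), (c or "").strip().upper()) for i, a, c in rows]
    # Load: upsert into the warehouse target.
    tgt.executemany(
        "INSERT OR REPLACE INTO orders_clean (id, amount, country) VALUES (?, ?, ?)",
        cleaned,
    )
    tgt.commit()

if __name__ == "__main__":
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    src.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, country TEXT)")
    src.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                    [(1, 10.456, " in "), (2, None, "us")])
    tgt.execute("CREATE TABLE orders_clean (id INTEGER PRIMARY KEY, amount REAL, country TEXT)")
    run_etl(src, tgt)
    print(tgt.execute("SELECT * FROM orders_clean").fetchall())
```

In a real Pentaho job each of these three comments corresponds to a step (table input, calculator/string operations, table output) wired together on the canvas.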
Posted 1 week ago
5.0 - 10.0 years
17 - 20 Lacs
Pune, Chennai, Bengaluru
Work from Office
Your Role: Architect, design, and implement data collection strategies across various channels (web, mobile, offline, etc.) using Tealium iQ Tag Management. Develop and maintain Tealium AudienceStream segments and triggers for customer segmentation and activation. Integrate Tealium CDP with other marketing technology platforms (e.g., CRM, DMP, email marketing platforms, ad servers). Develop and maintain custom JavaScript for data collection and enrichment.

Your Profile: Hands-on experience with Tealium iQ Tag Management and AudienceStream. Strong understanding of data collection methodologies, data warehousing, and data integration principles. Experience with JavaScript, HTML, and CSS. Experience with API integrations and data exchange formats (e.g., JSON, XML). Strong analytical and problem-solving skills. Excellent communication, interpersonal, and collaboration skills.

What you'll love about working here: You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, and you will get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work.

About Capgemini - Location: Pune, Bengaluru, Chennai, Hyderabad
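The API-integration and JSON skills listed above usually come together as event delivery between systems. As a rough, hypothetical sketch (the endpoint and payload shape are invented and are not Tealium's actual collection API), the JSON-over-HTTP pattern looks like this in Python:

```python
import json
import urllib.request

# Hypothetical collection endpoint -- not Tealium's real API.
COLLECT_URL = "https://collect.example.com/events"

def send_event(visitor_id: str, event_name: str, attributes: dict) -> int:
    """POST a single enrichment event as JSON and return the HTTP status code."""
    payload = {"visitor_id": visitor_id, "event": event_name, "data": attributes}
    req = urllib.request.Request(
        COLLECT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example call: send_event("abc-123", "cart_add", {"sku": "SKU-42", "qty": 1})
```

On the browser side the equivalent collection logic would be the custom JavaScript the role mentions, deployed through Tealium iQ.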
Posted 1 week ago
5.0 - 7.0 years
5 - 9 Lacs
Gurugram
Work from Office
Desired Skills and Experience:
- B.E./B.Tech/MCA/MBA in Information Systems, Computer Science, or a related field.
- 3+ years of strong experience in developing and managing Power BI dashboards and reports, preferably within the financial services industry.
- Experience in data warehousing, SQL, and hands-on expertise in ETL/ELT processes.
- Familiarity with Snowflake data warehousing solutions and integration.
- Proficiency in data integration from various sources including APIs and databases.
- Proficient in SQL for querying and manipulating data.
- Strong understanding of data warehousing concepts and practices.
- Experience deploying and managing dashboards on a Power BI server to service a large number of users.
- Familiarity with other BI tools and platforms.
- Experience with financial datasets and an understanding of private equity metrics.
- Knowledge of cloud platforms, particularly Azure, Snowflake, and Databricks.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills, both written and oral, with business and technical aptitude; good verbal and written communication and interpersonal skills.

Key Responsibilities:
- Create and maintain interactive and visually appealing Power BI dashboards to visualize data insights.
- Assist in building out the backlog of Power BI dashboards, ensuring they meet business requirements and provide actionable insights.
- Integrate data from various sources including APIs, databases, and cloud storage solutions such as Azure, Snowflake, and Databricks.
- Collect and maintain a firmwide inventory of existing reports, identifying those that need to be converted to Power BI.
- Collaborate with the team to contract and integrate Snowflake, ensuring seamless data flow and accessibility for reporting and analytics.
- Continuously refine and improve the user interface of dashboards based on ongoing input and feedback.
- Monitor and optimize the performance of dashboards to handle large volumes of data efficiently.
- Work closely with stakeholders to understand their reporting needs and translate them into effective Power BI solutions.
- Ensure the accuracy and reliability of data within Power BI dashboards and reports.
- Deploy dashboards onto a Power BI server serviced to a large number of users, ensuring high availability and performance; ensure dashboards provide self-service capabilities and are interactive for end-users.
- Create detailed documentation of BI processes and provide training to internal teams and clients on Power BI usage.
- Stay updated with the latest Power BI and Snowflake features and best practices to continuously improve reporting capabilities.
Behavioral Competencies:
- Effectively communicate with business and technology partners, peers, and stakeholders.
- Ability to deliver results on demanding timelines for real-world business problems.
- Ability to work independently and multi-task effectively.
- Identify and communicate areas for improvement.
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, show a natural aptitude for developing good internal working relationships, and keep a flexible work ethic.
- Responsible for quality checks and adhering to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT).
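The Snowflake-to-Power BI pipeline sketched in the responsibilities above usually starts with a simple connector query on the extraction side. A minimal sketch using the snowflake-connector-python package, with placeholder credentials and invented table names:

```python
import snowflake.connector  # pip install snowflake-connector-python

def fetch_fund_metrics():
    """Pull an aggregate that could feed a Power BI dataset (all names are placeholders)."""
    conn = snowflake.connector.connect(
        user="REPORTING_USER",   # placeholder credentials
        password="***",
        account="my_account",    # hypothetical account identifier
        warehouse="REPORTING_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT fund_id, SUM(commitment) AS total_commitment "
            "FROM fund_positions GROUP BY fund_id"
        )
        return cur.fetchall()
    finally:
        conn.close()
```

In a production setup Power BI would typically connect to Snowflake directly (import or DirectQuery); a script like this is more common for validation or one-off reconciliation of the numbers a dashboard shows.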
Posted 1 week ago
6.0 - 8.0 years
6 - 10 Lacs
Pune
Work from Office
**Job Title:** ASC - Senior Power BI Developer
**Location:** Pune, Maharashtra, India
**Experience:** 6 - 8 years

**Job Description:**
We are seeking a skilled and experienced Senior Power BI Developer to join our dynamic team in Pune. The ideal candidate will have extensive experience in Power BI development with a solid understanding of DAX queries and administration. As a Senior Power BI Developer, you will play a key role in transforming data into actionable insights, and you will be responsible for designing and maintaining interactive dashboards and reports.

**Key Responsibilities:**
- Design and develop Power BI reports and dashboards that meet business requirements and provide actionable insights.
- Create and optimize DAX queries for analytical calculations and data modeling.
- Collaborate with stakeholders to gather requirements and translate them into effective BI solutions.
- Manage Power BI administration tasks including user access, security, and performance tuning.
- Ensure data quality and consistency across reports and dashboards.
- Provide technical support and training to team members and end-users.
- Stay updated with industry trends and best practices in data visualization and analysis.

**Required Skills:**
- 6 to 8 years of experience in Power BI development.
- Strong proficiency in DAX queries and data modeling using Power BI.
- Experience with Power BI administration and user management.
- Excellent problem-solving skills and ability to work independently as well as in a team.
- Strong analytical and communication skills.
- Familiarity with data warehousing concepts and ETL processes is a plus.

**Preferred Qualifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Certifications in Power BI or relevant technologies would be an advantage.

**What We Offer:**
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative work environment with a focus on innovation and creativity.

If you are passionate about data and BI technology and meet the above requirements, we would love to hear from you!

**How to Apply:**
Interested candidates are invited to submit their resume along with a cover letter outlining their experience and suitability for the role to [insert application email or link].

**Roles and Responsibilities:**
We are seeking a highly skilled Senior Power BI Developer with over 6 years of extensive experience in Power BI development and administration to join our dynamic team. The ideal candidate will have a deep understanding of data visualization, business intelligence solutions, and the latest features of Power BI. This role involves working on Microsoft Fabric and requires strong expertise in SQL, with additional experience in Snowflake being a plus. Knowledge of Power Automate and Power Apps, as well as experience in migration projects, will be highly advantageous.

**Key Responsibilities:**
- Design, develop, and maintain advanced Power BI reports and dashboards to support business decision-making.
- Administer Power BI environments, including workspace management, security, and governance to ensure optimal performance and data integrity.
- Collaborate with stakeholders to gather and analyze business requirements, translating them into effective BI solutions.
- Work on Microsoft Fabric to integrate and manage data workflows, ensuring seamless data processing and reporting.
- Utilize the latest features of Power BI to enhance reporting capabilities and deliver innovative solutions.
- Write complex SQL queries to extract, transform, and load data from various sources for reporting purposes.
- Optimize data models and DAX calculations to improve the performance and usability of Power BI reports.
- Participate in data migration projects, ensuring smooth transitions and minimal disruption to business operations, if applicable.
- Integrate Power Automate and Power Apps to automate workflows and enhance application functionalities, where required.
- Mentor junior developers and provide technical guidance on Power BI best practices and solutions.

**Required Skills and Qualifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in Power BI development and administration, with a proven track record of delivering high-quality BI solutions.
- Strong expertise in Microsoft Fabric for data integration and management.
- Proficient in SQL for data extraction, transformation, and analysis.
- In-depth knowledge of the latest Power BI features and updates, with hands-on experience in implementing them.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication skills to interact with technical and non-technical stakeholders.

**Preferred Skills:**
- Experience working with Snowflake for cloud data warehousing solutions.
- Knowledge of Power Automate and Power Apps for workflow automation and application development.
- Prior involvement in migration projects, particularly related to BI tools or data platforms.
- Familiarity with other BI tools or technologies is a plus.
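To make the "DAX calculations and data modeling" requirement concrete, here is a small pandas sketch of the year-over-year logic a DAX measure typically expresses inside the model; the data and column names are invented, and the DAX named in the comment is only indicative.

```python
import pandas as pd

# Hypothetical sales fact table.
sales = pd.DataFrame({
    "year":   [2022, 2022, 2023, 2023],
    "region": ["EMEA", "APAC", "EMEA", "APAC"],
    "amount": [120.0, 90.0, 150.0, 81.0],
})

# Total per region/year, then year-over-year growth -- roughly what a DAX
# measure built on SAMEPERIODLASTYEAR and DIVIDE computes inside Power BI.
totals = sales.groupby(["region", "year"], as_index=False)["amount"].sum()
totals["yoy_pct"] = totals.groupby("region")["amount"].pct_change() * 100
print(totals)  # APAC falls 10% year over year, EMEA grows 25%
```

The difference in practice: the DAX version evaluates dynamically under whatever filters the report user applies, while this script computes one fixed cut of the data.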
Posted 1 week ago
5.0 - 10.0 years
6 - 10 Lacs
Chennai
Remote
We are looking for a highly skilled Senior SQL Developer with strong ETL development experience and a solid background in data analysis. The ideal candidate will play a key role in designing and optimizing data pipelines, developing robust SQL queries, and transforming complex data sets into meaningful business insights. This position requires a combination of technical expertise, problem-solving skills, and a strategic mindset to support data-driven decision-making across the organization.

Key Responsibilities:
- Design, develop, and optimize complex SQL queries, stored procedures, functions, and views for data extraction and reporting.
- Develop and maintain scalable ETL pipelines using tools such as Informatica, Talend, or custom scripts (Python, etc.).
- Collaborate with data architects, business analysts, and stakeholders to understand business requirements and deliver reliable data solutions.
- Analyze large datasets to uncover trends, identify anomalies, and support advanced analytics and reporting initiatives.
- Ensure data quality and integrity by performing thorough data validation and error handling.
- Monitor and optimize the performance of SQL queries and ETL workflows.
- Participate in database design, modeling, and data warehouse architecture improvements.
- Document data flows, data models, and technical specifications.
- Mentor junior developers and contribute to code reviews and best practices.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 5+ years of experience in SQL development and ETL processes.
- Proficiency in writing complex T-SQL (or PL/SQL) queries and performance tuning.
- Hands-on experience with ETL tools such as Informatica, Talend, or similar.
- Strong experience working with relational databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL).
- Analytical mindset with experience in translating business requirements into data solutions.
- Experience with data warehousing concepts and dimensional data modeling.
- Proficient in data visualization and reporting tools such as Power BI or Tableau.
- Solid understanding of data governance, security, and compliance standards.

Preferred:
- Experience with cloud-based data platforms (Azure Data Factory, AWS Glue, Google Cloud Dataflow).
- Knowledge of scripting languages like Python or shell scripting.
- Experience with Agile or DevOps methodologies.
- Strong understanding of business domains such as finance, healthcare, or e-commerce (if industry-specific).

Work Environment:
- Remote work flexibility.
- Cross-functional team collaboration with data engineers, BI analysts, and business teams.
- Opportunities to work on enterprise-level data projects and emerging technologies.

Please share your resume to srividyap@hexalytics.com.
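The "data validation and error handling" responsibility above is often implemented as a battery of SQL checks run after each load. A minimal, self-contained Python/SQLite sketch (check names, thresholds, and tables are all hypothetical):

```python
import sqlite3

# Hypothetical post-load data-quality checks; each query should return 0 on success.
CHECKS = {
    "row_count_mismatch":
        "SELECT ABS((SELECT COUNT(*) FROM stg_orders) - (SELECT COUNT(*) FROM fact_orders))",
    "null_keys":
        "SELECT COUNT(*) FROM fact_orders WHERE order_id IS NULL",
    "negative_amounts":
        "SELECT COUNT(*) FROM fact_orders WHERE amount < 0",
}

def run_checks(conn: sqlite3.Connection) -> dict:
    """Run every check and return {check_name: offending_value} for failures only."""
    failures = {}
    for name, sql in CHECKS.items():
        (value,) = conn.execute(sql).fetchone()
        if value != 0:
            failures[name] = value
    return failures

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE fact_orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO stg_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?)", [(1, 10.0), (None, -5.0)])
    print(run_checks(conn))  # flags the NULL key and the negative amount
```

The same queries translate directly to T-SQL or PL/SQL, and a failed check would typically halt the pipeline or raise an alert rather than just print.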
Posted 1 week ago
5.0 - 10.0 years
8 - 12 Lacs
Pune, Gurugram
Work from Office
What you'll do:
- Develop, configure, and optimize data integration processes using IICS.
- Design and implement complex ETL workflows to extract, transform, and load data from various sources into target systems.
- Collaborate with business analysts and data architects to understand data requirements and translate them into technical solutions.
- Implement data mappings, workflows, and sessions using Informatica IICS.
- Integrate data from on-premises databases, cloud applications, and third-party data sources.
- Ensure data quality and consistency through validation, cleansing, and transformation processes.
- Optimize ETL processes for maximum efficiency and scalability.
- Monitor and troubleshoot ETL jobs to ensure smooth operation and minimal downtime.
- Work closely with cross-functional teams including data architects, business analysts, and project managers to deliver data integration solutions.
- Document technical designs, mappings, workflows, and processes.
- Provide technical support and guidance to team members and stakeholders.
- Conduct unit and integration testing of ETL processes; validate data accuracy and integrity post-migration and integration.
- Maintain and enhance existing ETL workflows and data integration processes.
- Monitor production environments, resolve issues, and ensure data availability and integrity.

What you'll bring:
- Bachelor's or Master's degree in Business Analytics, Computer Science, MIS, or a related field with academic excellence.
- Proficiency in RDBMS concepts, SQL, and programming languages such as Python.
- Strong analytical and problem-solving skills to convert intricate business requirements into technology solutions.
- Knowledge of algorithms and data structures.

Additional Skills:
- 1.5-2.5 years of experience in data integration and ETL development.
- Previous experience with Informatica PowerCenter or other data integration tools is a plus.
- Experience with scripting languages (e.g., Python, shell scripting) is desirable.
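Incremental loads are a recurring theme in this kind of IICS work; mapping tasks provide last-run style system variables for the purpose. The underlying watermark pattern, shown generically in Python with invented table and column names (this is not IICS syntax):

```python
import sqlite3

def incremental_extract(conn: sqlite3.Connection, last_run: str):
    """Fetch only rows changed since the previous run and return the new watermark."""
    rows = conn.execute(
        "SELECT id, customer, updated_at FROM src_customers WHERE updated_at > ?",
        (last_run,),
    ).fetchall()
    # Advance the watermark to the newest change actually seen this run.
    new_watermark = max((r[2] for r in rows), default=last_run)
    return rows, new_watermark

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE src_customers (id INTEGER, customer TEXT, updated_at TEXT)")
    conn.executemany("INSERT INTO src_customers VALUES (?, ?, ?)", [
        (1, "Acme", "2024-01-01T09:00:00"),
        (2, "Globex", "2024-01-02T11:30:00"),
    ])
    rows, wm = incremental_extract(conn, "2024-01-01T12:00:00")
    print(rows, wm)  # only Globex qualifies; the watermark advances to its timestamp
```

Persisting the returned watermark between runs (rather than using wall-clock time) avoids missing rows that commit while an extract is in flight.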
Posted 1 week ago