15.0 - 19.0 years
0 Lacs
Karnataka
On-site
Our client is looking for an experienced and visionary Country Head for India to lead and expand their business operations while driving sustainable growth. The ideal candidate should have expertise in data management, cloud computing, enterprise solutions, and IT services. As the Country Head, you will be responsible for shaping and executing the India growth strategy, managing P&L, nurturing client relationships, and ensuring seamless service delivery in collaboration with global stakeholders.

Key Responsibilities:

Strategic Leadership & Business Growth:
- Develop and execute a comprehensive India business strategy aligned with global objectives.
- Drive revenue growth, profitability, and market expansion across India.
- Oversee sales, operations, and service delivery to ensure operational excellence.
- Identify and capitalize on new market opportunities to drive customer acquisition and business expansion.
- Own the P&L for India, ensuring financial targets are met while maintaining operational efficiency.

Client Engagement & Market Expansion:
- Build and nurture strategic relationships with enterprise clients, partners, and key stakeholders.
- Act as a trusted advisor to CXOs and senior leadership, understanding their digital transformation needs and delivering tailored solutions.
- Ensure high client satisfaction, engagement, and retention to foster long-term partnerships.

Operations & Service Delivery Excellence:
- Oversee end-to-end project delivery, ensuring high-quality, on-time execution.
- Drive operational efficiencies, process improvements, and service innovation.
- Ensure adherence to best practices in data governance, cloud computing, and analytics.
- Collaborate with global teams to optimize resource utilization and project execution.

Team Leadership & Cross-Functional Collaboration:
- Build and lead a high-performing team across sales, operations, and service delivery.
- Foster a culture of innovation, collaboration, and excellence.
- Work closely with global leadership to align regional objectives with corporate goals.
- Provide mentorship, coaching, and leadership to enhance team capabilities.

Qualifications & Experience:
- 15+ years of leadership experience in enterprise IT, data management, cloud solutions, or consulting.
- Expertise in Master Data Management (MDM), Data Governance, AI/ML, and cloud-based analytics.
- Proven track record of leading large teams, managing P&L, and scaling business operations.
- Extensive experience in enterprise sales, account management, and strategic partnerships.
- Strong ability to engage, influence, and build relationships with C-level executives and business leaders.
- Deep understanding of India's technology landscape, market trends, and competitive ecosystem.
- Excellent communication, negotiation, and stakeholder management skills.

This role offers a unique opportunity for a dynamic leader to shape the future of a high-growth business, drive transformational outcomes, and lead a talented team in one of the world's most exciting markets.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Databricks Developer at Newscape Consulting, you will play a crucial role in our data engineering team, focusing on building scalable and efficient data pipelines using Databricks, Apache Spark, Delta Lake, and cloud-native services (Azure/AWS/GCP). You will collaborate closely with data architects, data scientists, and business stakeholders to deliver high-performance, production-grade solutions that enhance user experience and productivity in the healthcare industry.

Key skills:
- Strong hands-on experience with Databricks, including Workspace, Jobs, DLT, Repos, and Unity Catalog.
- Proficiency in PySpark and Spark SQL; Scala is optional.
- Solid understanding of Delta Lake, Lakehouse architecture, and the medallion architecture.
- Proficiency in at least one cloud platform: Azure, AWS, or GCP.
- Experience with CI/CD for Databricks using tools such as Azure DevOps or GitHub Actions.
- Strong SQL skills and familiarity with data warehousing concepts.
- Knowledge of data governance, lineage, and catalog tools such as Unity Catalog or Purview is beneficial.
- Familiarity with orchestration tools such as Airflow, Azure Data Factory, or Databricks Workflows is desired.

This position is based in Pune, India, and is a full-time role with the option to work from the office. Strong communication, problem-solving, and stakeholder management skills are key attributes we are looking for in the ideal candidate.
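The medallion architecture mentioned above can be illustrated with a minimal, self-contained Python sketch. This is not Databricks code: plain lists and dictionaries stand in for Delta tables, and the record fields (`patient_id`, `visits`) are invented for illustration. In a real pipeline, each layer would be a Delta table written by PySpark jobs.

```python
# Minimal medallion-architecture sketch: raw (bronze) records are cleaned
# into a silver layer, then aggregated into a gold layer.

def to_silver(bronze):
    """Clean bronze records: drop rows missing a patient_id, normalize casing."""
    silver = []
    for row in bronze:
        if not row.get("patient_id"):
            continue  # quarantine incomplete records
        silver.append({
            "patient_id": row["patient_id"].strip().upper(),
            "visits": int(row.get("visits", 0)),
        })
    return silver

def to_gold(silver):
    """Aggregate silver records into per-patient visit totals."""
    gold = {}
    for row in silver:
        gold[row["patient_id"]] = gold.get(row["patient_id"], 0) + row["visits"]
    return gold

bronze = [
    {"patient_id": " p1 ", "visits": "2"},
    {"patient_id": "P1", "visits": "3"},
    {"patient_id": "", "visits": "9"},  # invalid: dropped at the silver layer
]
print(to_gold(to_silver(bronze)))  # {'P1': 5}
```

The point of the layering is that each stage is independently testable and replayable: bronze keeps the raw input, silver enforces quality rules, and gold serves consumers.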
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
This role is for one of Weekday's clients. The ideal candidate should have a minimum of 3 years of experience and be located in Bengaluru for a full-time position.

We are seeking a skilled and detail-oriented Informatica Data Quality (IDQ) Developer to join our data engineering team. As an IDQ Developer, your main responsibility will be to develop and implement data quality solutions using Informatica tools, supporting our organization's data governance, analytics, and reporting initiatives. The ideal candidate will have a strong background in data profiling, cleansing, and standardization, and a passion for enhancing data accuracy and reliability across enterprise systems.

Key responsibilities:
- Design and develop IDQ solutions by building and configuring Informatica Data Quality mappings, workflows, and rules.
- Conduct data profiling and analysis on source systems to identify anomalies and inconsistencies.
- Translate business data quality requirements into reusable rules, and collaborate with ETL and data integration teams to embed data quality checks into workflows.
- Monitor data quality, troubleshoot technical issues, and maintain technical documentation for IDQ solutions.

Required skills and qualifications:
- At least 3 years of experience in Informatica Data Quality development in a data-intensive or enterprise environment.
- Strong hands-on experience with IDQ components such as mappings, mapplets, transformations, and data quality rules.
- Proficiency in data profiling, cleansing, parsing, standardization, de-duplication, and address validation techniques.
- Good knowledge of relational databases such as Oracle, SQL Server, and PostgreSQL, and the ability to write complex SQL queries.
- Understanding of data governance, metadata management, and master data management concepts.
- Experience with data integration tools, especially Informatica PowerCenter, is a plus.
- Strong problem-solving skills, attention to detail, and excellent communication and collaboration skills.
- A Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred qualifications include Informatica IDQ Certification, experience in regulated industries such as banking, insurance, or healthcare, and familiarity with cloud-based data platforms such as AWS, GCP, or Azure.
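The standardization and de-duplication techniques listed above can be sketched in plain Python. This is not Informatica code — IDQ expresses these as mappings and reusable rules — but the rule logic is the same kind of thing; the phone format and field names here are invented for illustration.

```python
import re

# Plain-Python illustration of two IDQ-style data quality rules:
# standardization (normalize phone numbers) and de-duplication on a key.

def standardize_phone(raw):
    """Keep digits only; format 10-digit numbers as XXX-XXX-XXXX, else fail."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) == 10:
        return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"
    return None  # record fails the quality rule

def dedupe(records, key):
    """Drop duplicates on a key, keeping the first occurrence."""
    seen, out = set(), []
    for r in records:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

rows = [
    {"id": 1, "phone": "(080) 412 4567"},
    {"id": 1, "phone": "080 412 4567"},  # duplicate id: dropped
    {"id": 2, "phone": "12345"},         # fails standardization
]
clean = [dict(r, phone=standardize_phone(r["phone"])) for r in dedupe(rows, "id")]
print(clean)
```

In practice such rules are profiled first (how many records fail?) before being embedded as checks in the integration workflow.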
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At PowerSchool, we are a dedicated team of innovators guided by our shared purpose of powering personalized education for students around the world. From the central office to the classroom to the home, PowerSchool supports the entire educational ecosystem as the global leader of cloud-based software for K-12 education. Our employees make it all possible, and a career with us means you're joining a successful team committed to engaging, empowering, and improving the K-12 education experience everywhere.

Our Research & Development (R&D) team is the technical talent at the heart of our product suite, overseeing the product development lifecycle from concept to delivery. From engineering to quality assurance to data science, the R&D team ensures our customers seamlessly use our products and can depend on their consistency.

This position, under the general direction of Engineering leadership, will be responsible for technical and development support for our award-winning K-12 software. You will use your knowledge to implement, code, build, and test new features, maintain existing features, and develop reports that include components, data models, customization, and reporting features for our products. Your role will involve gathering and refining requirements, developing designs, implementing, testing, and documenting solutions to produce the highest quality product and customer satisfaction.

Responsibilities:
- Implement data replication and data ingestion software features and products following best practices.
- Specialize in data engineering as a member of a project team.
- Design and develop software engineering strategies.
- Design and implement ETL processes to extract, transform, and load data from diverse sources.
- Develop and optimize SQL queries for data extraction and transformation.
- Perform data profiling, cleansing, and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve issues related to data integration processes.
- Create and maintain documentation for ETL processes, data mappings, and transformations.
- Stay abreast of industry best practices and emerging technologies in ETL and data integration.
- Analyze performance and develop performance improvements.
- Assist with and analyze security best practices.
- Develop software to support internal initiatives and tools, and update framework and application functionality.
- Work as part of an Agile SCRUM team in the planning, scoping, estimation, and execution of technical solutions.
- Other duties as assigned.

Qualifications:
- Bachelor's degree in Computer Science or Information Technologies, or equivalent experience.
- 5+ years' experience in a software engineer role.
- Strong experience with Snowflake and various database platforms (MySQL, MSSQL, etc.).
- Strong experience in T-SQL and writing SQL transformations.
- Strong experience in building data engineering pipelines using Python.
- Experience with replication technologies such as SQL Replication, Fivetran, or Qlik Replicate.
- Understanding of data governance.
- Experience in building CI/CD pipelines.
- Excellent written and verbal communication skills.
- Excellent ability to work with current software design principles and concepts such as patterns and algorithms.
- Ability to handle a heavy workload while working on multiple projects with frequent interruptions.
- Ability to work in a changing, dynamic environment.
- Ability to provide accurate and reliable estimates.
- Willingness to work in a fast-paced environment.
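The ETL responsibilities above — extract from a source, transform with SQL, load a reporting table — can be sketched end to end with Python's standard-library `sqlite3` standing in for Snowflake or MSSQL. The table and column names are invented for illustration.

```python
import sqlite3

# Tiny ETL sketch: extract raw rows, transform with a SQL aggregation,
# and load the result into a reporting table.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_enrollment (school TEXT, students INTEGER)")
conn.executemany(
    "INSERT INTO raw_enrollment VALUES (?, ?)",
    [("North", 120), ("North", 80), ("South", 200)],  # extracted source rows
)

# Transform + load in one SQL statement (the 'T' and 'L' of ETL).
conn.execute(
    """CREATE TABLE enrollment_summary AS
       SELECT school, SUM(students) AS total_students
       FROM raw_enrollment GROUP BY school ORDER BY school"""
)
rows = conn.execute("SELECT * FROM enrollment_summary").fetchall()
print(rows)  # [('North', 200), ('South', 200)]
```

A production pipeline would add the data profiling and validation steps named above (row counts, null checks) between the extract and the load.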
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Delhi
On-site
Continue to make an impact with a company that is pushing the boundaries of what is possible. At NTT DATA, we are renowned for our technical excellence, leading innovations, and making a difference for our clients and society. Our workplace embraces diversity and inclusion - it's a place where you can continue to grow, belong, and thrive. Your career here is about believing in yourself and seizing new opportunities and challenges. It's about expanding your skills and expertise in your current role and preparing yourself for future advancements. That's why we encourage you to take every opportunity to further your career within our great global team.

We are seeking an experienced Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams.

Key Responsibilities:
- Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment.
- Data Ingestion and Integration: Develop data ingestion frameworks to collect data from various sources, then transform and integrate it into a unified data platform for GenAI model training and deployment.
- GenAI Model Integration: Collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance.
- Cloud Infrastructure Management: Design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance.
- Write scalable, readable, and maintainable code using object-oriented programming concepts in languages like Python, and utilize libraries like Hugging Face Transformers, PyTorch, or TensorFlow.
- Performance Optimization: Optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness.
- Data Security and Compliance: Ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications.
- Client Collaboration: Collaborate with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services.
- Innovation and R&D: Stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services.
- Knowledge Sharing: Share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or related fields (Master's recommended).
- Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications.
- 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or Cloud Native platforms).
- Proficiency in programming languages like SQL, Python, and PySpark.
- Strong data architecture, data modeling, and data governance skills.
- Experience with Big Data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), data warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi).
- Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions).
- Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras).

Nice to have:
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Experience integrating vector databases and implementing similarity search techniques, with a focus on GraphRAG, is a plus.
- Familiarity with API gateway and service mesh architectures.
- Experience with low-latency/streaming, batch, and micro-batch processing.
- Familiarity with Linux-based operating systems and REST APIs.

Location: Delhi or Bangalore
Workplace type: Hybrid Working
Equal Opportunity Employer
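The similarity search that vector databases such as Pinecone or Faiss perform at scale boils down to ranking stored embeddings by a distance metric, most commonly cosine similarity. A hand-rolled, stdlib-only sketch of that core operation (the vectors and document labels are invented):

```python
import math

# Cosine-similarity search over dense vectors -- the core operation a
# vector database optimizes with approximate-nearest-neighbor indexes.

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query, index, k=1):
    """Return the k document ids whose embeddings are closest to the query."""
    scored = sorted(index, key=lambda item: cosine(query, item[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

index = [
    ("doc_a", [1.0, 0.0, 0.0]),
    ("doc_b", [0.9, 0.1, 0.0]),
    ("doc_c", [0.0, 1.0, 0.0]),
]
print(top_k([1.0, 0.05, 0.0], index, k=2))  # ['doc_a', 'doc_b']
```

Real vector stores avoid this exhaustive scan by using approximate indexes (e.g., HNSW or IVF), trading a little recall for large speedups.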
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
We are looking for a highly experienced Senior Biostatistician Consultant with over 8 years of experience in statistical modeling, platform development, and advanced analytical solutions. This role requires in-depth knowledge of biostatistics, statistical process control, and extensive experience in developing and deploying R Shiny-based analytical platforms.

As a Senior Biostatistician Consultant, you will play a crucial role in building scalable statistical platforms that facilitate data-driven decision-making through advanced analytics. Your responsibilities will involve guiding scientific data interpretation, supporting both regulated and non-regulated environments, and delivering top-notch analytical solutions.

Your key responsibilities will include designing, developing, and deploying robust R Shiny applications and platforms for real-time data analysis in scientific and manufacturing settings. You will collaborate with IT and analytics teams to ensure platform scalability, security, and user-centric design. Additionally, you will lead statistical modeling for complex datasets, emphasize univariate and multivariate analysis, and employ advanced techniques such as SPC, PCA, and hypothesis testing.

In this role, you will act as the primary statistical advisor to cross-functional teams, present statistical findings clearly to stakeholders, recommend appropriate methodologies based on study design, and support data governance initiatives. Furthermore, you will mentor junior statisticians and platform developers, stay updated on statistical methodologies, and contribute to thought leadership within the organization.

To qualify for this position, you should hold a Ph.D. or Master's degree in Biostatistics, Statistics, Applied Mathematics, or a related field, and have over 8 years of experience in applied biostatistics, including proficiency in developing R Shiny applications and statistical platforms. Strong communication, collaboration, and documentation skills are essential, along with familiarity with GxP guidelines and regulatory considerations in clinical or manufacturing analytics.

Join our team at Aventior, a leading provider of innovative technology solutions for businesses worldwide. We leverage cutting-edge technologies like AI, ML Ops, and DevOps to address complex business challenges and foster growth for our clients. Our comprehensive data development and management services cater to various needs, ensuring customized solutions that align with each client's requirements. If you are eager to make a significant impact in the field of biostatistics and analytics, we welcome you to apply for this rewarding opportunity.
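The statistical process control (SPC) technique named above is, at its simplest, an individuals chart: compute control limits of mean ± 3σ from an in-control baseline, then flag any new measurement outside them. A stdlib-only sketch (the role uses R Shiny; Python is used here only for illustration, and the measurements are invented):

```python
import statistics

# Minimal SPC sketch: flag measurements outside mean +/- 3 sigma
# computed from an in-control baseline.

def control_limits(baseline):
    """Lower and upper control limits from baseline data."""
    mu = statistics.mean(baseline)
    sigma = statistics.pstdev(baseline)
    return mu - 3 * sigma, mu + 3 * sigma

def out_of_control(baseline, new_points):
    """Return the new measurements that violate the control limits."""
    lcl, ucl = control_limits(baseline)
    return [x for x in new_points if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]
print(control_limits(baseline))
print(out_of_control(baseline, [10.1, 11.5, 9.9]))  # [11.5]
```

In an R Shiny platform the same limits would be recomputed reactively and rendered as a live control chart for manufacturing users.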
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
The Marketing India team at Pega is a cross-functionally integrated group that provides support to Go-To-Market teams worldwide. As a Data Governance Lead in the Digital Marketing practice, you will collaborate with various departments and internal stakeholders to ensure that marketing strategies and tactics drive growth and impact effectively.

Your responsibilities will include overseeing data enrichment and acquisition efforts by implementing suitable tools and technologies to enhance data quality and completeness. You will establish and uphold robust data quality standards and processes to maintain data accuracy and consistency across all systems and touchpoints. Leading the Contact Governance team, you will offer guidance and mentorship while ensuring alignment with the overall Go-To-Market objectives.

In this role, you will critically evaluate existing processes and automate manual steps where possible. You will devise strategies to enrich contact data with additional information such as demographic details, firmographics, and engagement metrics. Collaboration with cross-functional teams to synchronize marketing initiatives with overarching business goals will be a key aspect of your work. Designing, maintaining, and optimizing databases and data systems to ensure data integrity and accuracy will also be part of your responsibilities.

Regular reporting and analysis on contact data quality, performance metrics, key insights, and trends will be essential for informed decision-making within the organization. You will provide insights on contact quality and guide the Go-To-Market team to leverage data effectively for enhanced efficiency. Identifying opportunities to enhance operational efficiency by streamlining processes, automating tasks, and optimizing workflows will be part of your continuous improvement efforts. You will also proactively address data quality issues by implementing monitoring tools and maintaining data governance policies and procedures to safeguard data privacy and security.

Requirements:
- Bachelor's degree in engineering, computer science, or data science.
- 8+ years of experience in data management, data governance, or a related field.
- Proven experience leading data management initiatives and teams.
- Strong understanding of data governance frameworks, data profiling, and data architecture.
- Proficiency in data analysis tools such as SQL, Python (coding in any cloud environment), and Excel.
- Experience with data consumption using different APIs.
- Knowledge of data visualization tools such as Power BI, Tableau, and GitLab is a plus.
- Strong analytical and problem-solving skills.
- Ability to grasp new concepts and technology quickly and communicate effectively with both business and IT stakeholders.
- Excellent oral and written communication skills, with the ability to collaborate effectively with business and technology leaders.

Pega offers:
- Gartner Analyst-acclaimed technology leadership across product categories.
- Continuous learning and development opportunities.
- An innovative, inclusive, agile, flexible, and enjoyable work environment.
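The contact-data-quality monitoring described above usually starts with a completeness metric: what fraction of required fields are populated, and which records fail the rule. A minimal sketch, assuming invented field names and a made-up required-field list:

```python
# Minimal contact-data-quality monitor: measure required-field completeness
# and flag records that fail the rule.

REQUIRED = ("email", "company", "country")

def completeness(contacts):
    """Fraction of required fields populated across all contacts."""
    filled = sum(1 for c in contacts for f in REQUIRED if c.get(f))
    return filled / (len(contacts) * len(REQUIRED))

def failing(contacts):
    """Emails of contacts missing at least one required field."""
    return [c["email"] for c in contacts
            if c.get("email") and not all(c.get(f) for f in REQUIRED)]

contacts = [
    {"email": "a@x.com", "company": "Acme", "country": "IN"},
    {"email": "b@x.com", "company": "", "country": "IN"},
]
print(round(completeness(contacts), 2))  # 0.83
print(failing(contacts))  # ['b@x.com']
```

Tracking this metric over time is what turns a one-off cleanup into the ongoing governance reporting the role calls for.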
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a PIM Architect at our client's organization, you will play a crucial role in leading the design and optimization of the Product Information Management (PIM) system. Your responsibilities will include ensuring that the architecture and integrations of the PIM system align with business objectives, drive operational efficiencies, and enhance product data management. You will be responsible for shaping the technical strategy and overseeing the overall structure of the PIM ecosystem.

Your key responsibilities will include leading the architecture and strategy of the PIM system to meet business requirements and data management needs. You will design and implement scalable, flexible, and efficient PIM systems that integrate seamlessly with other platforms such as eCommerce, ERP, CMS, and external data sources. Additionally, you will define data models, taxonomies, workflows, and processes to ensure high-quality and consistent product information across the business ecosystem.

You will collaborate closely with business stakeholders to understand their product data requirements and develop solutions to enhance the data management process. Providing technical leadership and guidance on PIM system configuration, data governance, and quality standards will be a key aspect of your role. You will also be responsible for developing and promoting best practices for PIM implementation, governance, and lifecycle management.

To succeed in this role, you should have at least 5 years of experience in PIM architecture, design, and strategy, with expertise in platforms like Akeneo, InRiver, or similar systems. You should have proven experience in leading PIM system design and implementation, integrating PIM with eCommerce, ERP, CMS, and other business platforms. Strong knowledge of data modeling, taxonomy design, workflow optimization, data governance, and data quality standards within the PIM ecosystem is essential. Proficiency in cloud technologies, scalability, and performance optimization for PIM systems is required. You should also possess strong leadership and communication skills to work effectively with cross-functional teams and stakeholders. Experience in managing PIM system performance, troubleshooting, and ensuring smooth operation at scale is crucial. Additionally, knowledge of agile methodologies and familiarity with project management tools such as Jira and Confluence will be beneficial.

This role offers the opportunity to stay current with emerging PIM trends, technologies, and best practices to ensure the long-term effectiveness of the systems. If you are a strategic thinker with a passion for optimizing product data management processes, we encourage you to apply for this exciting opportunity.
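The data models and taxonomies at the heart of a PIM can be sketched with a couple of dataclasses: a category tree that declares which attributes products in that category must supply, and a validation check against it. This is a generic illustration, not Akeneo's or any vendor's actual data model; the category and attribute names are invented.

```python
from dataclasses import dataclass, field

# Sketch of a PIM-style taxonomy: categories form a tree and declare the
# attributes a product in that category is required to carry.

@dataclass
class Category:
    name: str
    required_attrs: set = field(default_factory=set)
    parent: "Category | None" = None

    def path(self):
        """Human-readable breadcrumb from the root to this category."""
        node, parts = self, []
        while node:
            parts.append(node.name)
            node = node.parent
        return " > ".join(reversed(parts))

def missing_attrs(category, attributes):
    """Required attributes the product did not supply."""
    return sorted(category.required_attrs - set(attributes))

root = Category("Apparel", {"brand"})
shoes = Category("Shoes", {"brand", "size", "color"}, parent=root)
print(shoes.path())                                        # Apparel > Shoes
print(missing_attrs(shoes, {"brand": "X", "size": "42"}))  # ['color']
```

Real PIM platforms layer channel-specific completeness rules and localization on top of exactly this kind of category/attribute model.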
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
Karnataka
On-site
The Universal Data Catalog (UDC) / Visa Data Catalog (VDC) is a comprehensive metadata management platform designed to provide a centralized repository for all data-related information across the organization. The platform enables efficient data discovery, governance, and utilization by offering detailed metadata definitions for tables, columns, and other data assets. It supports various business units by improving data accessibility, enhancing data quality, and facilitating compliance with data governance policies. Our implementation leverages the open-source Datahub project, ensuring a robust, flexible, and scalable solution.

Key Responsibilities:
- Develop and maintain the Enterprise Data Catalog / Visa Data Catalog platform, ensuring it meets the organization's evolving needs.
- Implement and manage metadata ingestion processes to ensure the catalog is up to date with the latest data definitions and business context.
- Collaborate with data stewards, data owners, and other stakeholders to enrich metadata with business definitions, data lineage, and usage context.
- Enhance the catalog's search and discovery capabilities to provide users with intuitive and efficient access to data assets.
- Integrate the data catalog with other enterprise systems and tools to support data governance, data quality, and analytics initiatives.
- Monitor and optimize the performance of the data catalog to ensure scalability and reliability.
- Provide training and support to users on how to effectively leverage the data catalog for their data-related needs.
- Actively collaborate with the open-source community to contribute to and leverage the latest advancements in the Datahub project.
- Analyze industry best practices and keep the catalog functionality up to date with feature sets provided by the market, while focusing on Visa's scalability requirements.
- Champion the adoption of open infrastructure solutions that are fit for purpose while keeping technology relevant.
- Spend 70% of the time writing code in different languages, frameworks, and technology stacks.

Qualifications:

Basic Qualifications:
- 12 to 15+ years of experience in architecture design and development of large-scale data management platforms and data applications with simple solutions.
- Extensive hands-on coding and design skills in Java/Python for the backend, MVC (model-view-controller) for end-to-end development, and SQL/NoSQL technology. Familiarity with databases such as Oracle, DB2, and SQL Server. Web services (REST/SOAP/gRPC); React/Angular for the front end (UI front-end experience nice to have).
- Expertise in the design and management of complex data structures and data processes.
- Expertise in efficiently leveraging distributed big data systems, including but not limited to Hadoop Hive, Spark, and Kafka streaming.
- Deep knowledge and hands-on experience with big data and cloud computing technologies.
- Strong service architecture and development experience with high performance and scalability.
- Technical background in data, with a deep understanding of issues in multiple areas such as data acquisition, ingestion and processing, data management, distributed processing, and high availability.
- Strong drive for results, self-motivation, and a strong learning mindset, with a good understanding of related advanced and emerging technology; keeps up with technology developments in related areas of the industry that could be leveraged to enhance current architectures and build durable new ones.
- Bachelor's degree in Computer Science or a related technical discipline required; an advanced degree is a plus.
- Strong leadership and team player, with strong skills in mentoring and growing junior engineers.

Preferred Qualifications:
- Experience with ETL/ELT tools and applications.
- Experience with Apache NiFi and Apache Spark for processing large data sets.
- Experience with Elasticsearch.
- Knowledge of data catalog tools.
- Experience in building data pipeline development tools.
- Experience with data governance and data quality tools.
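The core catalog workflow described above — ingest metadata for data assets, then make them searchable — can be sketched in a few lines. This is a toy in the spirit of DataHub, not its actual API; the URN-style dataset names and fields are invented.

```python
# Toy metadata catalog: register datasets with column-level metadata,
# then search across names, descriptions, and columns.

class Catalog:
    def __init__(self):
        self._assets = {}

    def ingest(self, urn, columns, description=""):
        """Register (or refresh) a dataset's metadata by its identifier."""
        self._assets[urn] = {"columns": columns, "description": description}

    def search(self, term):
        """Return identifiers whose name, description, or columns match."""
        term = term.lower()
        return sorted(
            urn for urn, meta in self._assets.items()
            if term in urn.lower()
            or term in meta["description"].lower()
            or any(term in c.lower() for c in meta["columns"])
        )

catalog = Catalog()
catalog.ingest("warehouse.sales.orders", ["order_id", "amount"], "Daily orders")
catalog.ingest("warehouse.crm.accounts", ["account_id", "region"])
print(catalog.search("order"))   # ['warehouse.sales.orders']
print(catalog.search("region"))  # ['warehouse.crm.accounts']
```

A production catalog replaces the dict with a search index (e.g., Elasticsearch, as listed in the preferred qualifications) and adds lineage edges between assets.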
Posted 1 week ago
4.0 - 12.0 years
0 Lacs
Maharashtra
On-site
As an Azure Data Engineer specializing in Microsoft Fabric (Data Lake), based in Mumbai, you should have a minimum of 4 years of experience in the field, with at least 2 years dedicated to working with Microsoft Fabric technologies. Your expertise in Azure services is key, specifically in Data Lake, Synapse Analytics, Data Factory, Azure Storage, and Azure SQL.

Your responsibilities will involve data modeling, ETL/ELT processes, and data integration patterns. Experience with Power BI integration is essential for effective data visualization. Proficiency in SQL, Python, or PySpark for data transformations is required for this role. A solid understanding of data governance, security, and compliance in cloud environments is also necessary. Previous experience working in Agile/Scrum environments is a plus. Strong problem-solving skills and the ability to work both independently and collaboratively within a team are crucial for success in this position.
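One of the data integration patterns mentioned above is the incremental upsert (merge): incoming rows update existing keys and insert new ones, so reruns are idempotent. A pure-Python sketch with a dict standing in for a lakehouse table (in Fabric this would be a Delta merge via PySpark or SQL; the fields here are invented):

```python
# Upsert (merge) pattern common in lakehouse ETL/ELT: incoming rows
# update matching keys and insert new ones.

def merge(target, incoming, key="id"):
    """Apply incoming rows onto target by key; return (updated, inserted)."""
    updated = inserted = 0
    for row in incoming:
        if row[key] in target:
            target[row[key]].update(row)
            updated += 1
        else:
            target[row[key]] = dict(row)
            inserted += 1
    return updated, inserted

table = {1: {"id": 1, "qty": 5}}
stats = merge(table, [{"id": 1, "qty": 7}, {"id": 2, "qty": 3}])
print(stats)            # (1, 1)
print(table[1]["qty"])  # 7
```

The key property is that applying the same incoming batch twice leaves the table unchanged after the first pass, which is what makes scheduled pipeline reruns safe.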
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
You will be responsible for overseeing all master data governance and operations for the SAP systems and other applications/landscapes within the organization. Your role will involve designing, executing, and overseeing all Master Data Management (MDM) related processes to contribute to the organization's process and technology evolution.

Your key responsibilities will include implementing and maintaining data governance policies, standards, and processes for SAP Master Data Management to ensure data consistency, accuracy, and compliance. You will also monitor, measure, and improve the quality of master data by identifying inconsistencies, duplicates, and inaccuracies, and taking corrective actions as needed.

Furthermore, you will oversee and execute master data creation, updates, and deletions in SAP for various data domains such as Customer, Vendor, Material, BOM, and Pricing. Collaboration with business stakeholders and cross-functional teams, such as Finance, Supply Chain, and Procurement, will be essential to ensure that master data supports their needs and adheres to SAP best practices.

In addition, you will be responsible for identifying and implementing improvements to SAP MDM processes, leveraging automation and best practices to enhance data accuracy and efficiency. You will also conduct user training on data management best practices within SAP and provide ongoing support to ensure data standards are maintained. Your role will involve supporting data integration efforts related to mergers, acquisitions, or system upgrades by ensuring seamless data migration and system compatibility.

A minimum of 8-12 years of experience in SAP Master Data Management or a related role within SAP is required for this position. The ideal candidate should possess excellent communication and interpersonal skills, with the ability to effectively engage with stakeholders at all levels of the organization. An analytical mindset is crucial, along with the ability to translate data into actionable insights and recommendations. Hands-on experience in SAP modules such as MM, SD, FI, or PP/QM, or relevant SAP S/4HANA MDM experience, as well as relevant certifications, would be desirable. Knowledge of data governance tools and practices, experience with data quality monitoring and reporting tools, and a basic understanding of SQL or other database management tools are also preferred qualifications for this role.
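A recurring task in the duplicate-hunting described above is fuzzy matching of vendor or customer master records that differ only in formatting and legal suffixes. A minimal normalize-then-group sketch (not SAP functionality — the suffix list and vendor names are invented for illustration):

```python
import re

# Sketch of master-data duplicate detection: normalize vendor names,
# then group records that share a normalized key.

SUFFIXES = {"pvt", "ltd", "inc", "llc", "limited"}

def normalize(name):
    """Lowercase, strip punctuation, and drop common legal suffixes."""
    tokens = re.sub(r"[^a-z0-9 ]", "", name.lower()).split()
    return " ".join(t for t in tokens if t not in SUFFIXES)

def find_duplicates(vendors):
    """Return groups of vendor ids whose names normalize identically."""
    groups = {}
    for vid, name in vendors:
        groups.setdefault(normalize(name), []).append(vid)
    return [ids for ids in groups.values() if len(ids) > 1]

vendors = [
    ("V001", "Acme Pvt. Ltd."),
    ("V002", "ACME Limited"),
    ("V003", "Globex Inc"),
]
print(find_duplicates(vendors))  # [['V001', 'V002']]
```

Production MDM tools add phonetic and edit-distance matching on top of this exact-key grouping, but normalization is usually the first and highest-yield step.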
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As an Email Marketing & Automation Specialist, you will play a crucial role in managing end-to-end campaign execution for one of our key B2B SaaS clients. Your primary responsibility will involve working closely with the client's marketing team to drive engagement, generate qualified leads, and impact revenue through Pardot and Salesforce CRM. Your day-to-day tasks will include building, testing, and executing email campaigns in Pardot, ensuring optimization for responsiveness, engagement, and deliverability. You will also be responsible for creating and managing static and dynamic lists, developing segmentation rules, setting up automation rules, and managing engagement studio programs to ensure scalable and efficient campaigns. Data governance and compliance will be another key aspect of your role, where you will maintain clean prospect data, ensure compliance with GDPR, CAN-SPAM, and other data privacy regulations, as well as monitor syncs between Pardot and Salesforce to troubleshoot and fix any errors. Additionally, you will be involved in building and optimizing Pardot forms, form handlers, landing pages, and using custom redirects for analytics and attribution. Your expertise will also be required in creating reports to track performance, sharing insights on email metrics, engagement, and MQL generation, and setting up lead scoring models and grading criteria to prioritize sales-ready prospects. To excel in this role, you should have at least 4 years of experience in Email Marketing, Pardot, and Salesforce CRM within a B2B marketing environment. Strong communication skills, a deep understanding of email campaign best practices, database hygiene, lead nurturing, and familiarity with data privacy regulations are essential. Your ability to work independently, collaborate effectively, and act as a go-to resource for the client will be crucial for success. If you possess a Pardot certification, it will be considered a significant advantage. 
Join us in this exciting opportunity to make a real impact on our client's marketing efforts and drive success through effective email marketing and automation strategies.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Consultant with a focus on Data at AIONEERS-EFESO, an integral part of EFESO Management Consultants, you will play a crucial role in optimizing supply chains and achieving best-in-class standards for businesses. Your responsibilities will involve collaborating on customer and internal projects, specifically focusing on Data for Supply Chain Planning and Supply Chain Analytics. This will encompass tasks such as analyzing data, implementing data governance and management practices, and standardizing processes to enhance supply chain efficiency. Throughout projects, your main focus will be on developing master data models, monitoring the effectiveness of data quality enhancements, and presenting these improvements through dashboards in Power BI. Acting as the liaison between business and IT, you will contribute to the design, implementation, and integration of data into Advanced Planning Systems (APS), sourced from ERP systems like SAP, Oracle, or Microsoft Dynamics. Your role will also entail leading subprojects independently, guiding colleagues and stakeholders in defining and executing data architectures, and implementing data visualization tools to communicate analytical findings effectively. To excel in this position, you should possess at least 3 years of professional experience in consulting, analytics, Master Data Management, or Supply Chain Management. Strong expertise in SAP, Power BI, or similar analytics tools is essential, along with a relevant degree in fields like Supply Chain, IT, industrial engineering, or business administration. Your capabilities should extend to analyzing, building, and maintaining various data structures, ensuring seamless integration with Supply Chain Planning Systems like BY, o9, and Kinaxis. Proficiency in BI tools such as Power BI, QlikView, Qlik Sense, or Tableau is required for effective data visualization and reporting. 
Furthermore, experience in software implementation projects and familiarity with the Scrum methodology are advantageous. Proficiency in MS Office products, particularly MS Excel and PowerPoint, is necessary. Knowledge of end-to-end Supply Chain Planning processes, with a focus on data quality, governance, and standardization, would be a significant asset. Fluency in English is a must, with additional language skills in German, French, or Italian being beneficial. A keen interest in new technologies and digital solutions, coupled with strong interpersonal and communication skills, will further enhance your suitability for this role.

At AIONEERS-EFESO, you will have the opportunity to become a thought leader in digital supply chain transformation. The company offers a conducive team culture, flexible work hours, respect for your ideas, open discussions, attractive remuneration, paid maternity and paternity leave, comprehensive insurance plans, sponsored certifications in relevant technology areas, and an office located at a prime location in Mannheim. The focus is on your results rather than hours worked, providing you with the chance to actively contribute to innovative business strategies on a global scale. Join us in reshaping supply chain management and crafting your own success story.
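Monitoring the effectiveness of data quality improvements, as the role above describes, often starts with a completeness metric per master data field — the kind of number that ends up on a Power BI dashboard. A minimal stdlib Python sketch; the field names and sample records are illustrative assumptions:

```python
# Field-level completeness check for master data records.
# Required fields and sample records are illustrative assumptions.

REQUIRED_FIELDS = ["material_id", "description", "uom", "plant"]

def completeness(records, fields=REQUIRED_FIELDS):
    """Return {field: share of records with a non-empty value}."""
    n = len(records)
    return {
        f: sum(1 for r in records if str(r.get(f) or "").strip()) / n
        for f in fields
    }

records = [
    {"material_id": "M-001", "description": "Hex bolt", "uom": "EA", "plant": "P100"},
    {"material_id": "M-002", "description": "", "uom": "EA", "plant": "P100"},
    {"material_id": "M-003", "description": "Washer", "uom": None, "plant": "P200"},
]
print(completeness(records))
```

Tracking this ratio per field over time is one simple way to demonstrate that a data quality initiative is actually moving the needle.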
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
maharashtra
On-site
You will be responsible for system configuration and implementation, including conducting detailed analysis of business requirements and translating them into effective SAP MM/WM solutions. You will configure the SAP MM/WM module to meet the specific needs of the organization, which includes setting up material master data, procurement processes, and inventory management. Your role will involve ensuring the smooth integration of SAP MM with other SAP modules such as SD, PP, FI/CO, and SAP WM. You will oversee the full lifecycle of SAP MM/WM implementation projects from initial scoping to go-live and post-implementation support.

Additionally, you will evaluate existing business processes and identify opportunities for improvement using SAP MM/WM functionalities. Collaborating with stakeholders, you will design and implement optimized procurement, inventory management, and warehouse management processes. You will provide recommendations on best practices and assist the organization in adopting them to maximize the benefits of SAP MM/WM. Ensuring the accuracy and integrity of material master data and other related data within the SAP MM/WM system will be crucial.

As part of your responsibilities, you will develop and generate reports to provide insights into procurement activities, inventory levels, material requirements, and warehouse management. Furthermore, you will implement tools and processes for effective data governance and compliance with industry standards.

Join us in shaping a future where technology seamlessly aligns with purpose at Birlasoft.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
delhi
On-site
The Data Mining Visualization role is a crucial position in our organization, focused on interpreting and utilizing extensive data sets. You will be instrumental in converting raw data into insightful visualizations that drive strategic decision-making. By applying advanced data mining techniques and visualization tools, you will assist various departments in comprehending complex data, uncovering patterns and trends that contribute to business expansion. An ideal candidate will possess a strong analytical mindset and technical expertise in data analysis and visualization software. As a key member of the data analytics team, you will not only enrich data insights but also promote a data-centric culture, enabling stakeholders to make well-informed decisions based on clear visual representations of data.

You will be responsible for:
- Employing data mining techniques to extract valuable insights from large data sets.
- Creating and executing data visualization strategies that effectively communicate insights.
- Utilizing visual analytics tools like Tableau, Power BI, or similar programs.
- Collaborating with cross-functional teams to gather data requirements and inputs.
- Developing interactive dashboards to monitor KPIs and business performance metrics.
- Analyzing and interpreting intricate data sets to identify trends and patterns.
- Presenting findings and visualizations to stakeholders in a concise manner.
- Ensuring data accuracy and integrity throughout the data processing workflow.
- Designing and maintaining a scalable data visualization infrastructure.
- Providing training and support to staff on data visualization tools and techniques.
- Generating detailed reports summarizing analysis results and recommendations.
- Keeping abreast of the latest data visualization trends and tools.
- Performing quality control checks on visualizations and analytics outputs.
- Assisting in developing data mining models to predict business metrics.
- Documenting processes and maintaining records of data sources and methodologies.

Required Qualifications:
- Bachelor's degree in Data Science, Computer Science, Statistics, or a related field.
- Minimum of 3 years of experience in data mining and visualization.
- Proficiency in data visualization tools such as Tableau, Power BI, or D3.js.
- Strong skills in SQL and database management systems.
- Experience with programming languages like Python or R.
- Knowledge of statistical analysis methods and techniques.
- Excellent problem-solving and analytical skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Familiarity with machine learning concepts and techniques.
- Experience in creating automated reporting solutions.
- Strong organizational skills and attention to detail.
- Ability to work independently and collaboratively in a team environment.
- Proven track record of managing multiple projects simultaneously.
- Understanding of business intelligence concepts.
- Commitment to continuous learning and professional development.
- Experience with data governance and quality assurance practices.
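As a small illustration of the trend analysis this role calls for, here is a stdlib Python sketch that smooths a KPI series with a moving average and reads off the direction; the metric, values, and window size are illustrative assumptions:

```python
# Simple trend detection on a KPI series via a moving average, stdlib only.
# The series and window size are illustrative assumptions.

def moving_average(series, window=3):
    """Trailing moving average; output has len(series) - window + 1 points."""
    return [
        sum(series[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(series))
    ]

def trend(series, window=3):
    """Compare the first and last smoothed points to label the direction."""
    ma = moving_average(series, window)
    if ma[-1] > ma[0]:
        return "up"
    if ma[-1] < ma[0]:
        return "down"
    return "flat"

weekly_signups = [120, 130, 125, 140, 150, 160]
print(moving_average(weekly_signups))
print(trend(weekly_signups))  # up
```

Smoothing before comparing endpoints keeps a single noisy week (like the dip to 125) from flipping the reported direction — the same idea a dashboard's trendline encodes visually.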
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
We are searching for an experienced and highly motivated Sales Operations Manager to join our team. As a Sales Operations Manager, you will be responsible for managing and maintaining the sales data infrastructure, ensuring data accuracy and consistency. The role demands high attention to detail and the ability to efficiently manage and control CRM data. The ideal candidate should possess experience in handling large datasets and maintaining data quality standards to enhance the performance of sales processes.

Key Responsibilities:
- Lead and manage the sales operations team to guarantee data accuracy, integrity, and compliance with organizational standards.
- Maintain and control CRM data, ensuring it is regularly updated, accurate, and reflective of ongoing sales activities.
- Conduct routine data audits and quality checks on the CRM and other sales data systems to identify and resolve inconsistencies or discrepancies.
- Collaborate with sales teams to ensure proper integration and utilization of sales data across systems.
- Develop and implement data governance policies to uphold high data quality standards.
- Generate sales performance reports and dashboards using data visualization tools like Power BI or Tableau to aid decision-making.
- Analyze sales data to identify trends, patterns, and opportunities for optimizing the sales pipeline.
- Work closely with IT and business teams to ensure proper data management practices are maintained across platforms.
- Provide training and support to the sales team on CRM best practices and data management.
- Optimize data processes for improved performance and scalability as the sales function expands.
- Ensure compliance with data security and privacy regulations related to sales data.
- Assist in data-related projects, including migrations, upgrades, and system implementations.

A Bachelor's degree in Information Systems, Sales Operations, or a related field is required, along with 6+ years of experience in CRM data administration or similar roles.
- Strong understanding of CRM systems and sales data management tools (e.g., Salesforce, Workday, HubSpot).
- Proficiency in Excel is essential.
- Exceptional attention to detail and ability to maintain high data quality standards.
- Proficiency in SQL and data visualization tools like Power BI or Tableau is a plus.
- Experience with data governance, data audits, and maintaining data accuracy.
- Strong communication skills with the ability to collaborate across teams.
- Knowledge of data security and privacy regulations relevant to managing sales data.

Makse Group is an expert team of experienced consultants, managers, and advisors with a strong passion for supporting the Workday platform and adjacent business functions. The company headquarters are in Dallas with satellite offices in Denver and Gurugram. Visit our website at www.maksegroup.com
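Routine CRM data audits like those described above often begin with duplicate detection: normalize the key identifying fields, then group records that collide. A minimal Python sketch, with hypothetical field names and normalization rules:

```python
# Duplicate-record audit for CRM data: normalize key fields, then group.
# Field names and normalization rules are illustrative assumptions.

from collections import defaultdict

def normalize(record):
    """Canonical forms for comparison: lowercased email, tidied name."""
    email = record.get("email", "").strip().lower()
    name = " ".join(record.get("name", "").split()).title()
    return email, name

def find_duplicates(records):
    """Group record ids sharing a normalized email address."""
    groups = defaultdict(list)
    for r in records:
        email, _ = normalize(r)
        if email:
            groups[email].append(r["id"])
    return {k: v for k, v in groups.items() if len(v) > 1}

crm = [
    {"id": 1, "email": "Ann@Example.com ", "name": "ann  lee"},
    {"id": 2, "email": "ann@example.com", "name": "Ann Lee"},
    {"id": 3, "email": "bo@example.com", "name": "Bo Chen"},
]
print(find_duplicates(crm))  # {'ann@example.com': [1, 2]}
```

Real audits layer more keys (phone, account, fuzzy name matching) on top of this, but normalize-then-group is the core pattern behind most CRM dedupe reports.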
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for developing, maintaining, and optimizing reports and dashboards using SAP Business Objects, SAP BW, and Power BI. Collaborating with business users, you will understand their data needs and translate them into functional BI solutions. Extracting, transforming, and loading data from SAP and other sources will be essential to build efficient and scalable data models. Your role will ensure data accuracy, consistency, and integrity across BI platforms. Analyzing business processes and providing insights using SAP functional knowledge will be a key aspect of your responsibilities. Additionally, you will develop and maintain SQL queries, stored procedures, and data validation processes. Supporting ad-hoc reporting and business analytics needs and collaborating with IT and business teams for seamless integration of BI solutions are also part of this role. Monitoring BI system performance and troubleshooting issues as needed and training end-users on BI tools and best practices will be critical to your success. To excel in this role, you must possess a Bachelor's degree in computer science, Information Systems, Business Analytics, or a related field. You should have 5-7 years of experience as a BI Analyst or in a similar role. Hands-on experience with SAP Business Objects (BOBJ), SAP BW (Business Warehouse), and Power BI is mandatory. A strong understanding of SAP functional modules (e.g., Finance, Supply Chain, Sales & Distribution, HR, etc.) is required. Proficiency in SQL and data modeling, as well as experience with ETL processes and data warehousing concepts, are essential qualifications. Desirable qualifications include a Power BI certification. Your technical, business, and leadership skills will play a crucial role in this position. You should have the ability to analyze complex datasets and provide actionable insights. Being highly skilled in communicating business data with a focus on UI/UX and intelligent dashboard design is important. 
A solid understanding of leading and contemporary practices and capabilities in information management, data governance, reporting, and analytics is necessary. Experience in production support/BAU, working knowledge of change control processes and impacts, a high degree of problem-solving and technical skills, and the ability to evaluate and prioritize tasks are also required. Involvement in a mixture of new application development, maintenance, and technical support, as well as effectively liaising with internal customers at all levels within the organization and external parties when necessary, are key aspects of this role.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
At Cyanous, our mission is to empower every person and organization to achieve more, driving transformation across businesses worldwide. We engage with employees, clients, partners, and community organizations globally to address global challenges with global solutions. Cyanous is a leading global IT, consulting, and business process services company leveraging cognitive computing, automation, cloud, analytics, and emerging technologies.

This is a full-time role for an Adobe RT-CDP and ETL Solution Expert located on-site in Mumbai. As an expert in Adobe Real-Time Customer Data Platform (RT-CDP) and Extract, Transform, Load (ETL) solutions, you will be responsible for day-to-day tasks that help clients adapt to the digital world and succeed.

For this role, you should have 5+ years of hands-on experience with Adobe Real-Time CDP, Adobe Campaign Classic with GCP, and ETL solutions. You must possess a strong understanding of customer data management, data modeling, and data governance principles, with a proven track record of successfully implementing and maintaining large-scale data ETL projects. Excellent problem-solving and analytical skills are essential, with the ability to troubleshoot complex technical issues, along with strong communication, collaboration, and documentation skills, the ability to work collaboratively in a team, and experience working with cross-functional teams. A Bachelor's degree in Computer Science, Information Technology, or a related field is required.
The following skills are important for this position:
- Experience developing services using various GCP resources (Cloud Run, Pub/Sub, Load Balancer, etc.) and performing log/issue analysis using Cloud Logging.
- Intermediate or higher Python development skills: the ability to analyze written Python programs and develop APIs using various Python libraries.
- Understanding of Airflow: writing Airflow DAGs, using the Airflow UI, analyzing Airflow task execution logs, and reworking/analyzing errors.
- Using BigQuery: understanding its operating principles, analyzing SQL statements, and identifying issues.

Being an Adobe Certified Expert in Campaign Classic and/or Real-Time CDP is a plus. Migration of Adobe Campaign Classic from v7 to v8 and implementing IP warming strategies for optimal email deliverability and reduced spam filtering will be added advantages.
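The Airflow knowledge listed above centers on DAGs, whose core idea is that declared task dependencies determine execution order. A toy stdlib illustration of that ordering using Python's built-in topological sort — the task names are invented, and this is not Airflow's API, just the scheduling idea an Airflow DAG encodes:

```python
# Toy illustration of DAG task ordering (the idea behind an Airflow DAG),
# using the stdlib topological sorter. Task names are hypothetical.

from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "validate": {"extract"},
    "load_bigquery": {"validate"},
    "publish_pubsub": {"load_bigquery"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'validate', 'load_bigquery', 'publish_pubsub']
```

In Airflow itself the same dependency graph would be declared with operators and `>>` chaining, and the scheduler (rather than a list) decides when each task runs, including retries on failure.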
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You are a talented Lead Data Product Architect and Engineer responsible for designing, developing, and maintaining data solutions across APAC Advisory products. Your role involves ensuring data quality, security, scalability, and performance of the data products while collaborating with various stakeholders to understand business requirements and translate them into data models and architectures. Your responsibilities include defining a robust and scalable data architecture, facilitating the design and development of high-quality data resources, collaborating with other teams for implementation of data models and flows, communicating data methodology and results to stakeholders, and integrating various data sources with internal and external platforms. You will also be responsible for ensuring data quality, accuracy, and consistency across APAC Advisory products, monitoring and troubleshooting data issues, supporting data governance initiatives, providing data expertise to business stakeholders, and staying updated with industry trends related to data architecture, big data, and cloud solutions. As a Lead Data Product Architect and Engineer, you will provide support for applications and products on digital platforms, develop comprehensive data architecture and design elements, evaluate and recommend technologies for Data Lake development, and create a data marketplace and data dictionary. You will work with business stakeholders to gather requirements, design efficient data models, support analytics and reporting needs, and ensure data models support operational requirements. In addition, you will be responsible for data migration, integration, testing, and validation planning, producing responsive design solutions, partnering in delivering data governance best practices, and ensuring data quality, security, scalability, and performance of data systems and products. 
Collaboration with product managers, developers, analysts, and stakeholders to translate business requirements into data models and architectures is also a key aspect of your role.

To be successful in this role, you should have a Bachelor's degree in Computer Science or a related field, 5+ years of experience in managing data platforms and architecture, proficiency in data modeling, ETL processes, SQL, and big data technologies, and knowledge of data integration techniques and governance frameworks. Experience with cloud platforms and application development frameworks is highly desirable, along with strong communication, collaboration, and problem-solving skills.

Joining Cushman & Wakefield will offer you the opportunity to be part of a global real estate services firm committed to career development, diversity, and inclusion. You will benefit from a growing company, career progression opportunities, and a flexible work environment focused on technology and autonomy. Continuous learning and development opportunities, as well as a comprehensive employee benefits program, are also part of the work culture at Cushman & Wakefield.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
As a member of KKR's Data Engineering team, your role will involve executing against a robust data engineering book of work while establishing scalable and repeatable processes, best practices, and frameworks. You will be an integral part of KKR's enterprise data group, working towards collecting, managing, and harnessing the power of data across the diverse portfolio investments. Your primary objectives will revolve around data model design and the technical implementation and curation of consumable data sets, also known as Data Products. This function is crucial in ensuring the success of our new Data Engineering Vertical and contributing to our firm's data-driven approach. The ideal candidate for this role will have a strong foundation in data architecture, analytics, and engineering, with a proven track record of developing scalable data solutions. Expertise in Snowflake is essential, as you will be required to have advanced knowledge of its cloud data platform for seamless integration and management of data processes. Moreover, substantial experience in the financial services sector is necessary, showcasing a deep understanding of financial data and asset management. Your responsibilities will include evaluating the backlog of requested Data Products, designing data models that meet business user needs, developing Data Products using advanced SQL data concepts in Snowflake, implementing state-of-the-art tooling for version control, creating data pipelines, and designing custom databases, tables, and views within Snowflake. Additionally, you will need to collaborate with engineering and IT teams to achieve operational excellence through automation, maintain stakeholder relationships, and ensure best-in-class data governance practices. 
To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field and have at least 6 years of data engineering experience, including experience in curating data sets and building Data Products. Leadership experience in managing a team of Data Engineers is preferred. Proficiency in Excel, advanced SQL skills, and experience in Python or other programming languages are mandatory. Additionally, you should demonstrate attention to detail, initiative, a strong work ethic, delivery excellence, accountability, teamwork orientation, integrity, and professionalism. Excellent written, verbal, and interpersonal communication skills are required, along with the ability to build strong relationships with colleagues locally and globally.

If you are looking to be part of a dynamic team that is at the forefront of data engineering and innovation within the financial services sector, this role at KKR is an exciting opportunity to grow your career and make a significant impact.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As an MDM Solution Design and Implementation Specialist, you will be responsible for designing, developing, and implementing Informatica MDM solutions on IDMC to meet business requirements. This includes configuring MDM hubs, data modeling, and defining matching and cleansing rules within the cloud platform. Additionally, you will integrate IDMC MDM with other cloud and on-premise systems such as ERP and CRM using Informatica Cloud Data Integration and Cloud Application Integration services.

You will be involved in designing and implementing cloud-based data models for master data entities like Customer, Product, and Supplier within IDMC, ensuring alignment with data governance and quality standards. Leveraging Informatica Data Quality features within IDMC, you will enforce data consistency, cleansing, and validation while implementing data governance policies to ensure compliance.

Customization and workflow automation are key aspects of your role, where you will customize Informatica MDM workflows and processes in IDMC for efficient master data management. Automation of data ingestion, validation, and quality processes using IDMC's low-code capabilities will also be a part of your responsibilities. You will conduct testing, debugging, and troubleshooting of MDM solutions to ensure they meet requirements and operate seamlessly within the cloud environment. Furthermore, you will develop documentation for MDM configurations, workflows, and data models, along with generating reports on data quality, governance, and MDM performance.

Key Requirements:
- 3-5 years of hands-on experience with Informatica MDM and Informatica Data Management Cloud (IDMC), including configuration, customization, and integration.
- Strong knowledge of cloud-based MDM implementations, particularly using IDMC tools like Cloud Data Integration, Data Quality, and Master Data Management Hub.

Technical Skills:
- Proficiency in SQL, Java, RESTful APIs, and Web Services for cloud-based integration.
- Familiarity with Informatica Cloud Data Integration and the use of IDMC for data migration, transformation, and orchestration.

This position offers a competitive salary in the industry and is available for immediate start in Pune, Bangalore, or Gurugram.
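Matching rules, mentioned above as part of MDM hub configuration, decide when two source records refer to the same master entity. A toy Python sketch of the idea using string similarity — the threshold and field are illustrative assumptions, not how Informatica's matching engine is configured:

```python
# Toy MDM-style matching rule: two records "match" when their normalized
# names are similar enough. The 0.85 threshold is an illustrative
# assumption, not an Informatica default.

from difflib import SequenceMatcher

def similarity(a, b):
    """Ratio in [0, 1] between normalized (lowercased, stripped) strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def is_match(rec_a, rec_b, threshold=0.85):
    return similarity(rec_a["name"], rec_b["name"]) >= threshold

a = {"name": "Acme Corp"}
b = {"name": " ACME Corp. "}
c = {"name": "Globex Ltd"}
print(is_match(a, b))  # True
print(is_match(a, c))  # False
```

Production match rules combine several fields (name, address, tax id) with per-field weights and separate "automatic merge" versus "manual review" thresholds; the single-field ratio above only shows the fuzzy-comparison step.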
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
We are seeking an Ataccama Admin to become a valuable part of our team, assisting in the management and upkeep of our Ataccama data quality and data governance platform. Your responsibilities will include the installation, configuration, and maintenance of Ataccama on the AWS/Azure platform. Additionally, you will play a key role in developing and executing data quality rules and policies. A strong grasp of Ataccama architecture and best practices is essential, along with prior experience in data management and data governance. Familiarity with Collibra and Immuta administration would be advantageous, and proficiency in overseeing VMs, as well as both Windows and Linux systems, is required. Previous experience in the pharmaceutical sector is preferred. In this role, your duties will encompass the development and enforcement of data quality rules and policies, monitoring and reporting on data quality metrics, troubleshooting and resolving Ataccama-related issues, and staying informed on the latest features and best practices of Ataccama. Collaborating with cross-functional teams to implement data governance policies and procedures will be a key aspect of your responsibilities. You will also be tasked with managing and maintaining VMs and systems based on Windows and Linux platforms, overseeing redundancy, backup, and recovery plans and processes, and demonstrating a robust knowledge of AWS/Azure. The ideal candidate should possess a minimum of 4 years of experience working with Ataccama, along with a background in data management and data governance. Prior experience in administering Collibra and Immuta would be advantageous. Proficiency in managing VMs, Windows, and Linux systems, as well as experience in Performance Tuning, are essential requirements. Strong analytical and problem-solving skills, excellent communication, and the ability to work effectively in a team are all qualities we are looking for in potential candidates. 
A crucial requirement for this role is proficiency in Ataccama ONE administration, demonstrating your ability to effectively manage the Ataccama environment.
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
maharashtra
On-site
As a Records Management Sr. Group Manager at our organization, you will play a vital role in establishing and implementing enterprise Records Management Policy and Standards. Your responsibilities will include leading the planning and implementation of Global Records Management Office Standards and advising all Businesses, Functions, and Regions on procedures related to Records Management. Your impact will be reflected in the size of the team managed, strategic influence on the business, and interaction with other functions or businesses. You will be involved in various activities that require excellent communication skills, both internally and externally, often at a senior level. Your key responsibilities will include full management responsibility of a team or multiple teams, including managing people, performance evaluation, compensation, hiring, disciplinary actions/terminations, and budget approval. You will drive and execute the Global Records Management Office strategy, particularly focusing on the identification, inventory, and lifecycle management of records. Developing and maintaining strong partnerships with internal and external audit groups, Business and Operational leadership, Global stakeholders, Regional/Global Technology, Legal, and Archiving teams will be essential. You will establish oversight of the Records Management Standards and coordinate with regional and local teams to ensure adherence to Records Management Standards enterprise-wide. Furthermore, you will be responsible for establishing and implementing processes and procedures for governing and adopting Citi record-keeping policies in compliance with relevant regulations and business needs. Your expertise will be crucial in advising on Operational activities such as regional strategies, divestitures, third parties with records, legal/tax holds, and archiving. 
Monitoring relevant metrics to demonstrate the effectiveness and adherence of the Records Management Program, assessing risks, driving compliance with applicable laws and regulations, and supervising the activity of others to maintain standards will be part of your role.

To qualify for this position, you should have 15+ years of experience, including 10+ years of managerial experience in a complex financial organization; a particular focus in the data space is highly preferred, as is experience in the Records, Data, and Information Governance space. Strong communication, collaboration, negotiation, influencing, stakeholder management, organizational savvy, and governance skills are required, along with proven leadership skills and the ability to lead change and drive consensus among various audiences. A Bachelor's/University degree is required, and a Master's degree is preferred for this role.

As a Technology Records Management Lead, you will have oversight of all Records Management activities within the Technology function. Your responsibilities will include managing the organizational structure, responsibilities, and staffing levels of Record Management Units (RMUs), ensuring policy compliance, analyzing metrics, managing issues, and overseeing control testing and monitoring.

Join us in our Data Governance team and be a part of our dynamic environment where your expertise in Records Management will make a significant impact.
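Record lifecycle management of the kind described above hinges on two checks: has the retention period elapsed, and is the record free of legal/tax holds? A toy Python sketch with invented record classes and retention periods (not any firm's actual schedule):

```python
# Toy records-lifecycle check: a record is eligible for disposal only when
# its retention period has elapsed AND it is not under a legal/tax hold.
# Record classes and retention periods are illustrative assumptions.

from datetime import date

RETENTION_YEARS = {"trade_confirmation": 7, "email": 3, "hr_file": 10}

def disposal_eligible(record, today):
    created = record["created"]
    years = RETENTION_YEARS[record["record_class"]]
    expiry = created.replace(year=created.year + years)
    return today >= expiry and not record.get("legal_hold", False)

today = date(2025, 1, 1)
rec_old = {"record_class": "email", "created": date(2020, 6, 1)}
rec_held = {"record_class": "email", "created": date(2020, 6, 1), "legal_hold": True}
print(disposal_eligible(rec_old, today))   # True
print(disposal_eligible(rec_held, today))  # False
```

The second case is the one audits care about: a hold must override an otherwise-expired retention clock, which is why the hold check is ANDed in rather than handled separately.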
Posted 1 week ago
5.0 - 9.0 years
7 - 11 Lacs
mumbai
Work from Office
- Statistical Expertise: Advanced knowledge of statistical methods, including linear/non-linear modelling, hypothesis testing, and Bayesian techniques.
- AI/ML Integration: Strong skills in applying AI/ML algorithms (e.g., neural networks, random forests, anomaly detection) for data quality checks and predictive analysis. Experience with cloud-based environments (AWS, Azure, etc.).
- Quantitative Finance: Deep understanding of financial instruments, market data, and the use of quantitative methods in portfolio management and risk analysis.
- Programming Skills: Proficiency in statistical programming languages (Python, R, SQL) and experience with tools like MATLAB, SAS, or similar platforms.
- Automation: Experience in developing and implementing automated data validation processes, including real-time monitoring and alert systems.
- Data Governance: Strong knowledge of data management principles, regulatory compliance, and data governance practices, particularly in the context of financial services.
- Leadership & Mentorship: Ability to mentor and guide junior team members, sharing expertise in statistical analysis, AI/ML, and data quality best practices.
- Problem-Solving: Excellent analytical skills to identify root causes of data quality issues and implement long-term solutions.
- Collaboration: Strong ability to work with cross-functional teams, including data scientists, engineers, and financial experts, to enhance overall data quality.

Morningstar is an equal opportunity employer
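The automation and anomaly-detection skills listed above can be illustrated with a minimal sketch. This is a hypothetical example, not Morningstar's actual pipeline: a modified z-score check (median and MAD rather than mean and standard deviation, so a single bad tick cannot inflate the spread and mask itself) that flags suspect market-data points for review.

```python
from statistics import median

def flag_anomalies(values, threshold=3.5):
    """Return indices whose modified z-score exceeds the threshold.

    A minimal stand-in for an automated data-quality check; a
    production system would use richer models (e.g. isolation
    forests) and run the checks in real time with alerting.
    """
    med = median(values)
    deviations = [abs(v - med) for v in values]
    mad = median(deviations)  # median absolute deviation
    if mad == 0:
        return []  # no spread at all: nothing to flag
    # 0.6745 scales MAD to be comparable with a standard deviation
    return [i for i, d in enumerate(deviations)
            if 0.6745 * d / mad > threshold]

# A clean price series with one bad tick (e.g. a misplaced decimal)
series = [100.1, 100.3, 99.8, 100.0, 1000.0, 100.2]
print(flag_anomalies(series))  # → [4]
```

Note that a plain mean/stdev z-score would miss this point: the outlier drags the mean to ~250 and the standard deviation to ~367, giving itself a z-score of only ~2. Robust statistics are the usual fix for exactly this masking problem.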
Posted 1 week ago
3.0 - 8.0 years
11 - 21 Lacs
Pune
Work from Office
Job Title: BI Consultant
Company Name: NCSi

Job Description: As a BI Consultant at NCSi, you will be responsible for designing, developing, and implementing business intelligence solutions that empower our clients to make data-driven decisions. You will work closely with stakeholders to gather requirements, analyze data, and create insightful reports and dashboards. Your role will involve analyzing current systems and processes, identifying areas for improvement, and recommending solutions that align with business objectives. You will also provide training and support to end-users to ensure they can effectively utilize BI tools.

Key Responsibilities:
- Collaborate with clients to understand their data and reporting needs.
- Design and develop BI solutions, including dashboards, reports, and data models.
- Analyze large datasets to identify trends and actionable insights.
- Ensure data integrity and accuracy by implementing best practices in data governance.
- Create documentation for BI processes and solutions for future reference.
- Conduct workshops and training sessions for end-users on BI tools and dashboards.
- Stay current with industry trends and technologies to enhance BI capabilities.

Skills and Tools Required:
- Proficiency in BI tools such as Tableau, Power BI, or QlikView.
- Strong SQL skills for data querying and manipulation.
- Experience with data warehousing concepts and ETL processes.
- Knowledge of data visualization best practices.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills to interact with stakeholders.
- Ability to work independently and in a team environment.
- Familiarity with cloud-based BI solutions and technologies is a plus.
- Understanding of data governance and data quality principles.

Preferred Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Relevant certifications in BI tools or data analytics.
- 3+ years of experience in a BI or data analysis role.

NCSi offers a collaborative work environment where innovation is encouraged, and your contributions can significantly impact our clients' success. If you have a passion for data analytics and a drive to help organizations leverage their data, we invite you to apply for this exciting opportunity.

Roles and Responsibilities

About the Role: As a BI Consultant at NCSi, you will play a key role in transforming data into actionable insights. You will collaborate closely with clients to understand their business needs and deliver tailored BI solutions. This position requires a blend of technical expertise and strategic thinking to enhance data-driven decision-making.

About the Team: You will be part of a dynamic team of data professionals dedicated to providing innovative BI solutions. The team fosters a collaborative environment where knowledge sharing and continuous learning are encouraged. Together, you will work on diverse projects that contribute to the overall success of NCSi and its clients.

You are Responsible for:
- Analyzing client requirements and translating them into effective BI solutions.
- Designing, developing, and implementing data models and dashboards that facilitate decision-making.
- Conducting training sessions for clients to ensure they can effectively utilize BI tools and reports.
- Maintaining documentation of processes and solutions to ensure best practices are followed.

To succeed in this role, you should have the following:
- Proven experience in business intelligence and data analytics.
- Strong proficiency in BI tools like Tableau, Power BI, or similar platforms.
- Excellent analytical skills and ability to work with large data sets.
- Effective communication skills to articulate complex data concepts to non-technical stakeholders.
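The SQL and ETL skills the BI Consultant listing asks for can be sketched with a small hypothetical example (invented table and data, not NCSi's actual systems): loading raw rows into a staging table and running the kind of aggregation query that typically feeds a dashboard tile.

```python
import sqlite3

# Hypothetical sales rows standing in for a client's source system
rows = [
    ("North", "2024-01", 1200.0),
    ("North", "2024-02", 1500.0),
    ("South", "2024-01", 900.0),
    ("South", "2024-02", 1100.0),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
# Parameterized bulk insert: the "load" step of a toy ETL
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# Aggregation of the sort a BI dashboard would surface
query = """
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    ORDER BY total DESC
"""
for region, total in conn.execute(query):
    print(f"{region}: {total:.1f}")  # North: 2700.0, then South: 2000.0
```

In practice the same query pattern would run against a warehouse (and the visualization layer would be Tableau or Power BI rather than `print`), but the querying and data-modelling skills are the same.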
Posted 1 week ago