Jobs
Interviews

1052 ETL Processes Jobs - Page 11

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

You will collaborate with stakeholders to gather and understand business requirements for BI solutions. You will design and develop interactive and visually appealing dashboards, reports, and data visualizations using tools such as Qlik, Power BI, and Tableau. Working closely with data engineers and analysts, you will ensure the availability, accuracy, and timeliness of data for reporting and analytics purposes. You will also be responsible for developing ETL processes that move data from various sources into the data warehouse or BI platforms. Promoting best practices for data visualization, user experience, and data governance within the BI ecosystem will be a key aspect of your role. Strong presentation skills will be essential for communicating complex technical concepts to non-technical stakeholders.

Key Requirements:
- Proven experience of at least 4 years as a BI Developer or in a similar role
- Proficiency in at least two BI tools: Qlik, Power BI, Tableau
- Experience with 2-3 complete life-cycle implementations of BI projects
- Solid understanding of ETL processes and data integration techniques
- Advanced SQL skills for querying and manipulating data in relational databases
- Familiarity with data warehousing concepts and data modeling principles
- Experience optimizing and fine-tuning BI solutions for performance and efficiency
- Excellent analytical and problem-solving skills with a keen eye for detail
- Strong communication skills for interacting with both technical and non-technical stakeholders
- Good documentation skills for delivering BRDs, architecture and specification documents, project plans, etc.
- Basic experience with cloud concepts

In this role, you will play a crucial part in the successful implementation and maintenance of BI solutions, ensuring that they meet the needs of the business and provide valuable insights for decision-making.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Talend ETL Developer at Team Geek Solutions, you will be responsible for designing, developing, and maintaining ETL processes using Talend. Your role will involve implementing data integration solutions, collaborating with business stakeholders to understand data requirements, and optimizing SQL queries for data extraction and manipulation. You will be tasked with ensuring data accuracy and quality through data profiling and analysis, as well as monitoring and troubleshooting ETL jobs to ensure smooth data flow. Additionally, you will maintain documentation for ETL processes and data model designs, work with team members to design and enhance data warehouses, and develop data transformation logic to meet business needs.

To excel in this role, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have proven experience as an ETL Developer with expertise in Talend. A strong understanding of ETL frameworks and data integration principles, along with proficiency in writing and troubleshooting SQL queries, is essential. Experience in data modeling and database design and familiarity with data quality assessment methodologies are also required. You should be able to analyze complex data sets, provide actionable insights, and demonstrate strong problem-solving and analytical skills. Excellent communication and interpersonal skills are necessary for collaborating effectively in a team-oriented environment. Knowledge of data warehousing concepts and best practices, as well as experience with Agile development methodologies, will be valuable assets. Willingness to learn new technologies, attention to detail, commitment to delivering high-quality solutions, and the ability to manage multiple tasks and deadlines are key attributes for success. Experience with performance tuning and optimization of ETL jobs is a plus.

If you are passionate about data warehousing, troubleshooting, ETL processes, workflow management, data modeling, SQL, data profiling and analysis, data governance, data integration, and Agile methodology, and possess the required skills and qualifications, we invite you to join our innovative and collaborative team at Team Geek Solutions in Mumbai or Pune.

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Delhi

On-site

As a Data Analyst at Auditorsdesk, a cloud-based e-auditing software company based in Delhi, you will play a crucial role in transforming raw data into actionable insights that drive business success. Your responsibilities will include collecting and analyzing large datasets to identify trends, patterns, and insights. You will develop and maintain data pipelines in Python for efficient data extraction, transformation, and loading (ETL). Collaborating with cross-functional teams to understand business requirements and provide data-driven insights will be a key aspect of your role. Additionally, you will design and implement data visualization solutions to communicate findings and support decision-making. Performing exploratory data analysis to uncover hidden trends and correlations in the data will also be part of your day-to-day tasks. It is essential to stay current with industry trends and best practices in data analytics and Python programming.

To excel in this role, you must have proven experience in Python programming and strong proficiency with Python libraries for data manipulation and analysis (e.g., Pandas, NumPy, SciPy). Experience with data visualization tools such as Tableau, Power BI, or Matplotlib is required, along with a solid understanding of statistical concepts and data modeling techniques. We are looking for excellent problem-solving skills, attention to detail, and strong communication skills to convey complex findings to non-technical stakeholders. You should be able to work both independently and collaboratively in a fast-paced environment.

This is an on-site internship position based in New Delhi with a compensation of INR 15,000 per month. Immediate joiners will be given preference, and only shortlisted candidates will be contacted for interviews.

If you are passionate about data analytics, data science, and problem-solving, we encourage you to apply for this exciting opportunity to be part of our dynamic team at Auditorsdesk.
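For context on the extract-transform-load pipelines this role describes, here is a minimal sketch in pure Python. The posting names Pandas, but this sketch uses only the standard library so it is self-contained; the sales data, field names, and cleaning rules are hypothetical, not from the posting:

```python
import csv
import io
import sqlite3

# Hypothetical raw records; in practice these would come from a CSV export.
RAW_CSV = """date,region,amount
2024-01-05,delhi,1200
2024-01-07,mumbai,
2024-01-09,delhi,800
"""

def extract(text):
    """Extract: parse CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts, normalize region names."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # skip incomplete records
        clean.append((row["date"], row["region"].title(), float(row["amount"])))
    return clean

def load(rows):
    """Load: insert into an in-memory SQLite table and aggregate by region."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (date TEXT, region TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    return dict(con.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"))

totals = load(transform(extract(RAW_CSV)))
print(totals)  # {'Delhi': 2000.0}
```

In a production pipeline the same three stages would typically read from files or APIs and load into a warehouse, but the shape of the code stays the same.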

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Join us in our mission to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all. As an Analytics Senior Associate, you will be part of the Client & Network Analytics team, providing valuable insights to stakeholders regarding the critical services we offer. Your work will contribute to improving the overall performance of our service delivery, ultimately benefiting our clients by streamlining processes and enabling them to focus on delivering patient care.

Now let's delve into what we are looking for in you: you are a naturally curious team player who enjoys exploring, learning, and leveraging data to drive improvements. Your role will involve framing business problems, analyzing data to identify solutions, and presenting your findings through compelling visualizations. Collaboration with various teams within the organization is key, as you will work closely with partners to ensure a cohesive approach towards achieving our objectives.

As part of our team, you will be immersed in a dynamic environment where we prioritize creating visibility into business performance and driving tangible impact for athenahealth and our provider customers. Our approach is agile-based, utilizing cutting-edge technologies such as Snowflake, Tableau, and Sigma to extract, transform, and visualize data effectively. Your responsibilities will include extracting data from diverse sources to enhance visibility into performance metrics, building tools and visualizations to uncover insights, and collaborating with partners to align business objectives with metrics. You will have the autonomy to plan and execute your work with strategic guidance from leadership, focusing on areas such as back-end engineering, analysis/visualization, or leading scrum teams.

To thrive in this role, you should hold a Bachelor's degree, preferably in a quantitative discipline such as Computer Science, Data Engineering, or Statistics. With a minimum of 8 years of experience in a data analytics environment, you should possess a strong understanding of business intelligence tools, data warehousing techniques, and relational databases. Proficiency in SQL and additional programming languages such as R, Python, or JavaScript is essential, along with experience in data visualization tools such as Tableau.

Joining athenahealth means being part of a culture that values innovation, collaboration, and personal growth. Our diverse workforce, or "athenistas", drives our mission forward by bringing unique perspectives and experiences to the table. We strive to create a work environment where every individual feels empowered to contribute their best, supported by a range of benefits, perks, and opportunities for professional development. At athenahealth, we believe in giving back to our community and supporting causes that align with our purpose. Through our athenaGives platform, we aim to make a positive impact on food security, healthcare access, and STEM education. By fostering a culture of learning, inclusivity, and work-life balance, we empower our employees to thrive both personally and professionally. Come be a part of our journey to revolutionize healthcare, where your contributions will be recognized and your career will have room to grow within our dynamic and supportive environment.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

Skillsoft is on a mission to propel organizations and individuals towards growth through transformative learning experiences. The company believes in the potential of every team member to be amazing and invites individuals to join in the journey of transforming learning and helping individuals unleash their edge.

As a Senior Data Engineer in the Enterprise Data Solutions team, housed within the Digital & Information Technology organization, you will play a crucial role in driving operational excellence and innovation. Collaborating with various functional groups, you will provide expertise and technologies to digitalize, simplify, and scale the company for the future. The ideal candidate for this role will have a solid background in data engineering, data analysis, business intelligence, and data management. Your responsibilities will include managing the ingestion, processing, and storage of data in the Azure Databricks Data Lake and SQL Server data warehouses.

The Enterprise Data Solutions team at Skillsoft serves as the data backbone, enabling seamless connections between systems and facilitating data-driven business insights through analytics-ready data sets. The team's mission is to deliver analytics-ready data sets that enhance business insights, drive decision-making, and foster a culture of data-driven innovation, while setting a gold standard for process, collaboration, and communication.

Key Responsibilities:
- Lead the identification of business data requirements, create data models, and design processes aligned with business logic.
- Design ETL processes, develop source-to-target mappings, and manage load processes to support regular and ad hoc activities.
- Work with open-source tools to build data products for exploring and interacting with complex data sets.
- Build robust systems and reusable code modules, ensuring long-term maintenance and support.
- Perform testing to guarantee accurate processes, comply with organizational standards, and optimize data structures.
- Collaborate with the team through code reviews and technical guidance.
- Document data flows and technical designs to ensure compliance with best practices.
- Monitor timelines and workload to ensure delivery promises are met.
- Support the BI mission by learning new technologies and providing technical guidance to the team.

Skills & Qualifications:
- Bachelor's degree in a quantitative field.
- 5+ years of experience in data engineering/data management.
- Proficiency in Azure Databricks, SQL, and PySpark.
- Agile methodology experience.
- Cloud migration expertise and interest.
- Strong analytical skills and technical abilities.

Skillsoft, a leader in online learning and talent solutions, empowers organizations to unlock the potential in their people. With a focus on leadership development, business skills, technology, digital transformation, and compliance, Skillsoft democratizes learning through an intelligent learning experience. Partnering with Fortune 500 companies, Skillsoft offers award-winning systems to support learning, performance, and success. If you are intrigued by this opportunity, Skillsoft welcomes your application.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Database Developer at our company, you will be responsible for designing, developing, and maintaining SQL Server databases. Your role will involve creating and managing Power BI reports and dashboards to provide actionable insights. Additionally, you will develop and maintain data pipelines using Azure Data Factory. Collaboration with cross-functional teams to understand data requirements and deliver solutions will be a key aspect of your job. You will be expected to optimize database performance, ensure data integrity, and troubleshoot and resolve data-related issues. Preparing documentation for database applications and developing best practices for database design and development will also be part of your responsibilities, as will staying updated with the latest industry trends and technologies and completing any other duties as assigned.

To be successful in this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience with SQL Server, including writing complex queries and stored procedures, is required, as is strong proficiency in Power BI, including DAX and Power Query. Experience with Azure services, particularly Azure Data Factory and pipelines, is advantageous, and good working knowledge of reporting tools such as Power BI and SSRS is also beneficial. Exposure to maintaining and supporting databases in Azure and on-premises is highly desirable, and knowledge of data warehousing concepts and ETL processes is a plus. Excellent problem-solving skills and attention to detail are crucial for this role. Strong communication and teamwork skills are essential, and you must be able to communicate effectively in English, both verbally and in writing.

While performing your duties, you may be required to sit, use hands to finger, handle, or feel, and reach with hands and arms. Specific vision abilities required include close vision, peripheral vision, depth perception, and the ability to adjust focus.

AMETEK, Inc. is a leading global provider of industrial technology solutions, with annual sales over $7.0 billion. Committed to making a safer, more sustainable, and more productive world a reality, we use differentiated technology solutions to solve our customers' most complex challenges. With 21,000 colleagues in 35 countries, we are grounded by our core values: Ethics and Integrity, Respect for the Individual, Inclusion, Teamwork, and Social Responsibility. AMETEK (NYSE: AME) is a component of the S&P 500. Visit www.ametek.com for more information.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

At PwC, our team in managed services focuses on providing outsourced solutions and support across various functions for our clients. By managing key processes and functions, we help organizations streamline their operations, reduce costs, and enhance efficiency. Our team is highly skilled in project management, technology, and process optimization to deliver top-notch services to our clients. In managed service management and strategy roles at PwC, your focus will be on transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your responsibilities will include driving continuous improvement and optimizing managed services processes, tools, and services.

As a member of our team, you are expected to be driven by curiosity and be a reliable contributor. In our dynamic environment, you will need to adapt to working with various clients and team members, each presenting unique challenges and opportunities for growth. Taking ownership and consistently delivering high-quality work that adds value for our clients and contributes to team success is crucial. Your journey at the Firm will involve building a strong personal brand that opens doors to further opportunities. To excel in this role, you should apply a learning mindset and be proactive in your development. Valuing diverse perspectives, understanding others' needs and feelings, and maintaining habits that support high performance are essential. Effective communication skills, active listening, seeking feedback, and the ability to analyze information from multiple sources are key for success. Building commercial awareness, adhering to professional and technical standards, and upholding the Firm's code of conduct are integral aspects of this role.

Job Profile Name: Data Engineer Associate - Offshore Oracle Analytics Cloud Developer
Job Title: Associate, Oracle Analytics Cloud Developer
Location: Offshore
Key Skills: Oracle Analytics Cloud (OAC), Spotfire, Data Visualization

Summary: We are looking for an Associate Oracle Analytics Cloud Developer with experience in Spotfire to join our team. The role involves developing analytics solutions to drive business insights and improve decision-making processes.

Minimum Degree Required: Bachelor's degree in computer science/IT, data science, or a relevant field
Minimum Years of Experience: 1+ years in Oracle Analytics Cloud development
Certifications Required: None
Certifications Preferred: Certifications in Oracle Analytics Cloud, Spotfire, or related technologies

Required Knowledge/Skills:
- Basic experience in Oracle Analytics Cloud development
- Familiarity with Spotfire for data visualization

Key Responsibilities:
- Assist in developing Oracle Analytics Cloud applications
- Design and implement basic dashboards and reports
- Collaborate with stakeholders to understand and address their analytics requirements

Qualifications:
- Basic skills in data visualization and analytics
- Strong analytical and problem-solving abilities

Preferred Skills:
- Understanding of ETL processes and data integration
- Exposure to additional BI tools

What We Offer:
- Competitive salary and benefits package
- Opportunities for professional growth and development
- A collaborative and innovative work environment

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are looking for a BI Project Manager to join our team in Hyderabad. As the Project Manager, you will be responsible for overseeing finance-related projects, ETL processes, and Power BI reporting. Your role will involve leading project lifecycles, collaborating with cross-functional teams, and translating business requirements into actionable insights.

Your key responsibilities will include managing end-to-end project lifecycles for financial data and reporting initiatives, defining project scope and deliverables in collaboration with various teams, ensuring the accuracy of financial data through ETL workflows, designing and developing Power BI dashboards and reports, tracking project progress, mitigating risks, and delivering projects within budget and timelines. Additionally, you will translate business requirements into technical specifications, conduct stakeholder meetings, and ensure data compliance and security protocols are followed.

To qualify for this role, you should have a Bachelor's or Master's degree in Finance, Computer Science, Information Systems, or a related field, along with at least 5 years of project management experience, preferably in the financial or banking sectors. You should possess a strong understanding of ETL processes, data pipelines, and data warehousing concepts, as well as hands-on experience with Power BI tools such as DAX, Power Query, data modeling, and report publishing. Your track record should demonstrate successful project deliveries within deadlines, and you should have excellent communication, problem-solving, and stakeholder management skills. Experience with Agile or Scrum methodologies is an added advantage.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

About At Dawn Technologies

At At Dawn Technologies, we are a niche company dedicated to providing specialized tech solutions in Gen AI, Data Engineering, and Backend Systems. We stand out by focusing on delivering innovative solutions that prioritize flexibility, independence, and real technological depth. Our core belief is that exceptional results are achieved through the combination of teamwork and talent. We are seeking individuals who are great problem-solvers and enjoy collaborating within a team to deliver significant outcomes.

Job Overview

As a Senior Solution Architect specialized in Database Technology at At Dawn Technologies, you will play a crucial role in architecting, designing, and implementing complex database solutions for our enterprise clients. You will utilize your expertise in both relational and non-relational database systems to develop scalable, secure, and high-performance solutions that align with our clients' business requirements.

Responsibilities

Solution Architecture & Design:
- Lead the design and implementation of database solutions, focusing on scalability, performance, availability, and security.
- Offer expert guidance on selecting and integrating database technologies based on client needs.
- Design and deliver reliable, high-performance, and secure database architectures tailored to specific business use cases.
- Develop detailed system architecture blueprints, including data models, database schemas, and system integrations.

Consultation & Collaboration:
- Collaborate with business stakeholders, technical teams, and project managers to translate business requirements into technical solutions.
- Provide consulting services to clients on database management best practices, optimization, and scaling.
- Support pre-sales activities through technical presentations, scoping, and project estimation.

Technology Leadership:
- Stay updated on emerging database technologies and trends, recommending innovative solutions.
- Mentor junior architects and developers, promoting the adoption of best practices in database design and implementation.
- Lead troubleshooting and performance optimization efforts to ensure database systems meet defined service-level agreements (SLAs).

Project & Vendor Management:
- Oversee the successful implementation of database solutions across projects, aligning with architecture principles, timelines, and budgets.
- Manage relationships with third-party database vendors and tools to ensure effective integration into the architecture.
- Contribute to the development and management of the project roadmap for timely and cost-effective solution delivery.

Security & Compliance:
- Ensure database solutions adhere to industry standards and compliance regulations.
- Implement data protection measures, including encryption, backup strategies, and disaster recovery planning.
- Conduct regular reviews of database security and performance.

Required Skills & Experience

Technical Expertise:
- Proficiency in relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra, DynamoDB).
- Strong understanding of cloud-based database platforms (AWS RDS, Azure SQL Database, Google Cloud SQL).
- Expertise in database design, optimization, and scaling for transactional and analytical workloads.
- Familiarity with data warehousing, ETL processes, and data integration technologies.

Architecture & Solution Design:
- Proven experience in architecting large-scale, high-performance database systems.
- Knowledge of data modeling, schema design, and database performance tuning.
- Experience with cloud-native architectures and distributed database systems.

Tools & Technologies:
- Proficiency in database automation, monitoring, and management tools (e.g., Liquibase, Flyway, DBeaver, Prometheus).
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Familiarity with DevOps pipelines for database deployments and CI/CD practices.

Leadership & Communication:
- Strong leadership and mentoring skills to guide cross-functional teams.
- Excellent communication and presentation abilities to convey technical concepts to diverse stakeholders.
- Capacity to manage multiple projects concurrently and effectively prioritize tasks.

Certifications:
- Oracle Certified Architect or relevant database certifications.
- Cloud certifications (AWS Certified Solutions Architect, Microsoft Certified: Azure Solutions Architect Expert).

Benefits & Perks For Working At At Dawn
- Performance-based compensation.
- Remote-first work environment.
- Comprehensive healthcare benefits.
- Generous paid time off and flexible leave policies.
- Access to learning & development resources.
- A culture centered on innovation, excellence, and growth.

At At Dawn Technologies, we foster a workplace culture that values innovation, ownership, and excellence. We are committed to being an equal-opportunity employer that promotes diversity and inclusion. Join us in redefining the future!

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere, on-premises or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it's no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications.

As an Analytics Engineer at MongoDB, you will play a critical role in leveraging data to drive informed decision-making and simplify end-user engagement across our most critical data sets. You will be responsible for designing, developing, and maintaining robust analytics solutions, ensuring data integrity, and enabling data-driven insights across all of MongoDB. This role requires an analytical thinker with strong technical expertise to contribute to the growth and success of the entire business. We are looking to speak to candidates who are based in Gurugram for our hybrid working model.

Responsibilities:
- Design, implement, and maintain highly performant data post-processing pipelines
- Create shared data assets that will act as the company's source-of-truth for critical business metrics
- Partner with analytics stakeholders to curate analysis-ready datasets and augment the generation of actionable insights
- Partner with data engineering to expose governed datasets to the rest of the organization
- Make impactful contributions to our analytics infrastructure, systems, and tools
- Create and manage documentation, and conduct knowledge-sharing sessions to proliferate tribal knowledge and best practices
- Maintain consistent planning and tracking of work in JIRA tickets

Skills & Attributes:
- Bachelor's degree (or equivalent) in mathematics, computer science, information technology, engineering, or a related discipline
- 2-4 years of relevant experience
- Strong proficiency in SQL and experience working with relational databases
- Solid understanding of data modeling and ETL processes
- Proficiency in Python for data manipulation and analysis
- Familiarity with CI/CD concepts and experience managing codebases with git
- Experience managing ETL and data pipeline orchestration with dbt and Airflow
- Familiarity with basic command-line functions
- Experience translating project requirements into a set of technical sub-tasks that build towards a final deliverable
- Commitment to continuous improvement, with a passion for building processes and tools to make everyone more efficient
- The ability to collaborate effectively cross-functionally to drive actionable and measurable results
- A passion for AI as an enhancing tool to improve workflows, increase productivity, and generate smarter outcomes
- Strong communication skills to document technical processes clearly and lead knowledge-sharing efforts across teams
- A desire to constantly learn and improve

To drive the personal growth and business impact of our employees, we're committed to developing a supportive and enriching culture for everyone. From employee affinity groups to fertility assistance and a generous parental leave policy, we value our employees' wellbeing and want to support them along every step of their professional and personal journeys. Learn more about what it's like to work at MongoDB and help us make an impact on the world!

MongoDB is committed to providing any necessary accommodations for individuals with disabilities within our application and interview process. To request an accommodation due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer.
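The posting above asks for shared data assets that act as a source-of-truth for business metrics. As an illustration of that idea, here is a hedged sketch using SQLite: the metric (monthly active accounts), the table, and the data are all hypothetical, not MongoDB's actual models. The point is that the metric's definition lives in one governed view, so every downstream consumer computes it the same way:

```python
import sqlite3

# Illustrative only: encode one agreed-upon metric definition as a view,
# so downstream dashboards and analyses all query the same logic.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE events (account_id INTEGER, event_month TEXT);
INSERT INTO events VALUES
    (1, '2024-01'), (1, '2024-01'), (2, '2024-01'), (1, '2024-02');

-- The shared data asset: the source-of-truth metric definition.
CREATE VIEW monthly_active_accounts AS
SELECT event_month, COUNT(DISTINCT account_id) AS active_accounts
FROM events
GROUP BY event_month;
""")

rows = con.execute(
    "SELECT * FROM monthly_active_accounts ORDER BY event_month").fetchall()
print(rows)  # [('2024-01', 2), ('2024-02', 1)]
```

In a real stack this definition would typically live in a dbt model rather than a raw view, but the design principle is identical: one curated definition, many consumers.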

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

Remote

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, we transform ideas into impact by bringing together data, science, technology and human ingenuity to deliver better outcomes for all. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS. What you'll do: Solution Design and Architecture: Lead the design and architecture of Varicent solutions to meet client-specific business requirements.Develop comprehensive solution blueprints and technical specifications. Implementation and Configuration: Oversee the implementation and configuration of Varicent solutions, ensuring alignment with best practices and client needs.Customize and configure Varicent modules to support complex compensation plans and business rules. Stakeholder Engagement: Collaborate with clients to gather and analyze business requirements, translating them into effective Varicent solutions.Serve as the primary point of contact for clients, providing expert guidance and support throughout the project lifecycle. Integration and Data Management: Design and implement data integration strategies to ensure seamless data flow between Varicent and other enterprise systems.Manage data migration and ensure data accuracy and integrity. Testing and Quality Assurance: Develop and execute testing plans to validate solution functionality and performance.Identify and resolve any issues or discrepancies during the testing phase. 
- Training and Support: Provide training and support to clients and internal teams on Varicent solutions. Develop training materials and documentation to facilitate knowledge transfer.

What you'll bring:
- 5-6 years of experience working with Varicent, with a focus on solutions.
- Strong understanding of incentive compensation management processes and best practices.
- Proven experience in solution design, architecture, and implementation.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills, with the ability to engage effectively with clients and stakeholders.
- Experience with data integration and management, including ETL processes.
- Ability to work independently and as part of a collaborative team.
- Ability to mentor junior team members and onboard them onto the Varicent platform.

Preferred Qualifications: Varicent certification or equivalent experience. Experience in multiple industries, such as insurance, finance, and med-tech.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed.
Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we honor the visible and invisible elements of our identities, personal experiences, and belief systems: the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are integral to your success here. We are committed to building a team that reflects a broad variety of backgrounds, perspectives, and experiences. Learn more about our inclusion and belonging efforts and the networks ZS supports to assist our ZSers in cultivating community spaces and obtaining the resources they need to thrive. If you're eager to grow, contribute, and bring your unique self to our work, we encourage you to apply. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To complete your application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Data Engineer at our company, you will be a valuable member of the team responsible for designing, implementing, and maintaining our data infrastructure and pipelines. Your role will involve collaborating with cross-functional teams to ensure data integrity and availability while meeting the data requirements of the business. You will be expected to translate business requirements into analytics solutions, design and maintain scalable data pipelines and ETL processes, and support and enhance enterprise data platforms. Working closely with data scientists, analysts, and software engineers, you will implement effective data solutions and processes to ensure data accuracy and consistency. Your responsibilities will also include optimizing data pipelines for performance and efficiency, researching new technologies to enhance data processing capabilities, and ensuring that systems meet business requirements and industry practices within an Agile framework. The ideal candidate should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 5-7 years of proven experience as a Data Engineer or in a similar role. Strong proficiency in SQL, Python, PySpark, and big data technologies is required, as well as experience with data modeling, ETL processes, and data warehousing. Knowledge of relational SQL and NoSQL databases is also necessary. Preferred qualifications include knowledge of the pharma industry and experience with cloud computing services such as AWS and Azure. Please note that this position requires 2 days of work from the office. If you do not meet every job requirement, we encourage you to apply anyway; we are committed to fostering a diverse and inclusive workplace, and your excitement for the role and potential fit matter more than meeting every qualification.
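The pipeline work described above follows the classic extract-transform-load pattern. The sketch below is a minimal illustration using only the Python standard library; the table, columns, and sample data are invented, and a production pipeline of the kind this role describes would typically use PySpark and an orchestrator rather than in-memory SQLite.

```python
import csv
import io
import sqlite3

# Extract: read raw records (here from an in-memory CSV; in practice, files or APIs).
raw = io.StringIO("order_id,amount,region\n1,120.50,APAC\n2,80.00,EMEA\n3,not_a_number,APAC\n")
rows = list(csv.DictReader(raw))

# Transform: validate and coerce types, dropping malformed records.
clean = []
for r in rows:
    try:
        clean.append((int(r["order_id"]), float(r["amount"]), r["region"]))
    except ValueError:
        continue  # a real pipeline would log or quarantine bad rows

# Load: write the cleaned records into a warehouse table (SQLite stands in here).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
total_by_region = dict(con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"))
```

The bad third row is dropped during the transform step rather than aborting the whole load, which is the usual data-quality posture in batch ETL.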

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Power BI Visualization Engineer, you will be a valuable member of our analytics team, contributing your expertise in designing and building interactive Power BI dashboards and reports. Your primary responsibility will be to translate complex datasets into visually appealing insights that aid in business decision-making. Your proficiency in Power BI, coupled with secondary skills in Power Platform, Python, data modeling, and ETL processes, will be instrumental in creating impactful data visualization solutions. Your key responsibilities will include designing, developing, and deploying interactive Power BI dashboards and reports tailored to meet specific business requirements. You will collaborate with various teams to gather requirements, define key performance indicators (KPIs), and ensure that the data structures within Power BI are optimized. Writing and managing complex DAX formulas, creating relationships between data sources, and supporting ETL processes will also be crucial aspects of your role. Furthermore, you will leverage Power Platform tools such as Power Apps and Power Automate to enhance dashboard interactivity and automate workflows. Collaborating with Data Engineering to support large data volumes, using Python for data wrangling and custom logic embedding, and ensuring data quality, governance, and security compliance across all reporting solutions will be essential components of your work. To excel in this role, you must have proven experience in Power BI dashboard and data model development, a strong command of Power BI Desktop, Power BI Service, and DAX, as well as proficiency in data modeling concepts and creating relationships between data sources. Additionally, familiarity with Power Platform tools, ETL development using Power Query, SSIS, or Python, and hands-on experience with SQL and databases like SQL Server or PostgreSQL are required. 
Preferred qualifications include experience with Azure Data Services, machine learning integration into Power BI, knowledge of pipeline orchestration tools such as the Kedro framework, exposure to AI-driven analytics and predictive modeling, and relevant Microsoft certifications. If you are passionate about data visualization, analytics, and Power BI innovation, we encourage you to apply to join our high-impact team dedicated to driving data-led decisions.
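The "complex DAX formulas" mentioned above usually encode KPI logic such as year-over-year growth. As an illustration only, the calculation behind a hypothetical DAX measure like DIVIDE([Revenue] - [Revenue LY], [Revenue LY]) can be mirrored in plain Python (the figures and measure name are invented):

```python
# Annual revenue figures, as a Power BI model table might hold them (hypothetical).
revenue = {2023: 1200.0, 2024: 1500.0}

def yoy_growth(current_year: int) -> float:
    """Year-over-year growth: the KPI a DAX measure of the form
    DIVIDE([Revenue] - [Revenue LY], [Revenue LY]) would compute."""
    prev = revenue[current_year - 1]
    return (revenue[current_year] - prev) / prev

growth = yoy_growth(2024)  # 0.25 with the sample figures, i.e. 25% growth
```

In a real dashboard this logic lives in the data model as a measure, evaluated per filter context, rather than in application code.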

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You will be working as a full-time AWS Data Engineer specializing in Redshift and Databricks. This hybrid role is based in Bengaluru with the option for remote work. Your primary responsibilities will include data engineering tasks such as data modeling, ETL processes, data warehousing, and data analytics on platforms like Redshift and Databricks. To excel in this role, you should possess strong skills in data engineering and data modeling. Experience with extract-transform-load (ETL) processes, proficiency in data warehousing techniques, and data analytics capabilities are essential, as are strong problem-solving and analytical thinking. Prior experience with AWS services such as Redshift, and with Databricks, will be advantageous. A Bachelor's degree in Computer Science, Engineering, or a related field is required. Additionally, holding a relevant certification such as AWS Certified Data Analytics - Specialty will be considered a plus.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

At PwC, the team in data and analytics is dedicated to utilizing data effectively to derive insights and support informed business decisions. By applying advanced analytics techniques, the focus is on aiding clients in optimizing operations and attaining strategic objectives. As a data analyst at PwC, your role involves employing sophisticated analytical methods to extract valuable insights from extensive datasets, thereby facilitating data-driven decision-making. Your responsibilities include utilizing expertise in data manipulation, visualization, and statistical modeling to assist clients in resolving intricate business challenges. Your primary focus revolves around nurturing client relationships, fostering meaningful connections, and honing your leadership skills. As you navigate complex scenarios, you are continuously enhancing your personal brand, expanding technical proficiency, and identifying your strengths. It is essential to anticipate the requirements of both your teams and clients while consistently delivering high-quality outcomes. Embracing ambiguity with ease, you demonstrate comfort in situations where the path forward may not be clear. You actively seek clarification, view uncertainties as learning opportunities, and use these moments to further develop yourself. To excel in this position, you are expected to possess a diverse set of skills, knowledge, and experiences. These may include, but are not limited to:

- Demonstrating effective responsiveness to diverse perspectives, needs, and emotions of others.
- Utilizing a wide array of tools, methodologies, and techniques to generate innovative ideas and solve complex problems.
- Applying critical thinking to deconstruct intricate concepts effectively.
- Understanding the broader objectives of your projects or roles and how your contributions align with the overall strategy.
- Developing a comprehensive understanding of the evolving business landscape and its impact on operations.
- Engaging in reflective practices to enhance self-awareness, reinforce strengths, and address areas for development.
- Interpreting data to derive meaningful insights and formulate actionable recommendations.
- Adhering to and upholding professional and technical standards, including specific guidelines related to PwC's tax and audit practices, the Firm's code of conduct, and independence requirements.

For the role specific to Python, the ideal candidate should possess the following base skill set:

- Proficiency in Python programming, with a solid foundation in object-oriented principles.
- Competence in Python for application development, with optional knowledge of Artificial Intelligence and Machine Learning libraries.
- Strong familiarity with Python data science libraries such as Pandas, NumPy, Plotly, and Matplotlib.
- Hands-on experience in Python API programming using FastAPI, Flask, Django, or Graphene.
- Proficiency in exception handling and unit testing within Python.
- Competence in Docker and Kubernetes for containerization and orchestration.
- Extensive expertise in ETL (Extract, Transform, Load) processes, encompassing data processing, manipulation, and bulk data transfers.
- Strong background in relational database design, with proficiency in crafting advanced SQL queries and stored procedures, preferably with databases like Postgres, Oracle, or MySQL.
- Demonstrated experience with CI/CD (Continuous Integration/Continuous Deployment) using Azure DevOps or Jenkins.
- Knowledge of the Software Development Life Cycle (SDLC) and Agile methodologies.
- Familiarity with version control tools such as Git.
- Excellent communication and problem-solving skills.

Additionally, the following skills are considered advantageous:

- Experience with Azure/AWS platforms.
- Proficiency in ReactJS or JavaScript.
- Background in UI/UX design.
- Familiarity with PowerBI/Tableau.
These base skills can be applied in various functional areas/domains, including Banking and Risk and Regulatory functions (compliance testing, reconciliation & reporting). Furthermore, there are cross-deployment opportunities in Java-based projects and technology consulting.
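Two of the base skills listed above, exception handling and unit testing, pair naturally in ETL code: a transform should survive bad input, and a test should prove it does. A small self-contained sketch (function and data are invented for illustration):

```python
import unittest

def to_amounts(records):
    """Transform raw values into floats, skipping malformed entries
    instead of letting one bad record abort the whole batch."""
    amounts = []
    for rec in records:
        try:
            amounts.append(float(rec))
        except (TypeError, ValueError):
            continue
    return amounts

class ToAmountsTest(unittest.TestCase):
    def test_skips_bad_records(self):
        # "oops" raises ValueError, None raises TypeError; both are skipped.
        self.assertEqual(to_amounts(["1.5", "oops", None, "2"]), [1.5, 2.0])

# Run the test case programmatically (a project would use a test runner or CI).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ToAmountsTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Catching only the specific exceptions the conversion can raise, rather than a bare except, keeps genuine bugs visible.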

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

As an Associate Oracle Analytics Cloud Developer at our company in Bangalore, you will be responsible for developing analytics solutions using Oracle Analytics Cloud (OAC) and Spotfire to drive business insights and improve decision-making processes. You should possess a Bachelor's degree in Computer Science/IT, Data Science, or a related field, along with at least one year of experience in Oracle Analytics Cloud development. Your key responsibilities will include assisting in the development of Oracle Analytics Cloud applications, designing and implementing basic dashboards and reports, and collaborating with stakeholders to understand and fulfill their analytics requirements. You should have basic skills in data visualization and analytics, as well as strong analytical and problem-solving abilities. Preferred qualifications for this role include familiarity with ETL processes and data integration, along with exposure to other Business Intelligence (BI) tools. While certifications are not required, holding certifications in Oracle Analytics Cloud, Spotfire, or related technologies would be advantageous. In return, we offer a competitive salary and benefits package, opportunities for professional growth and development, and a collaborative and innovative work environment where you can thrive and contribute to the success of our team.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As a Lead Business Analyst, AVP at Deutsche Bank in Mumbai, India, you will play a pivotal role in designing and delivering critical senior management dashboards and analytics using tools like Tableau, Power BI, and more. Your primary responsibility will be to create management packs that facilitate timely decision-making for various businesses within the organization, thereby establishing a strong foundation for analytics. Collaboration with senior business managers, data engineers, and stakeholders from different teams is essential to comprehend requirements and transform them into visually appealing dashboards and reports. Your analytical skills will be key in deriving valuable insights from business data for strategic ad hoc exercises. In this role, you will collaborate closely with business users and managers to gather requirements, understand business needs, and devise optimal solutions. Conducting ad hoc data analysis as per business needs to generate reports, visualizations, and presentations that aid in strategic decision-making will be a core part of your responsibilities. Additionally, you will be tasked with sourcing information from various outlets, constructing a robust data pipeline model, and working with large and complex datasets to extract meaningful insights. Ensuring the integrity and accuracy of findings through audit checks and timely data refreshes for up-to-date information dissemination will be crucial aspects of your role. To excel in this position, you should hold a Bachelor's degree in computer science, IT, Business Administration, or a related field, coupled with a minimum of 5 years of experience in visual reporting development, including hands-on dashboard creation and working with intricate data sets. Proficiency in Tableau, Power BI, or similar BI tools, advanced Excel skills, data visualization best practices, data analysis, modeling, ETL processes, and SQL is required. 
Strong analytical, quantitative, problem-solving, and organizational skills are essential, along with attention to detail, the ability to multitask and prioritize, and a record of meeting deadlines. Effective communication and writing skills will help you convey project updates and recommendations. At Deutsche Bank, you will receive training, development opportunities, coaching from experts, and a culture of continuous learning to support your career growth. The company promotes a positive, fair, and inclusive work environment, welcoming applications from all individuals. Together, the teams at Deutsche Bank aim to excel collaboratively, act responsibly, think commercially, take initiative, and celebrate mutual successes. For further details about Deutsche Bank and its values, please visit the company website: https://www.db.com/company/company.htm

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

You are an experienced Ab Initio Developer with 3 to 6 years of experience in Ab Initio and related technologies. Your expertise lies in data warehousing concepts, ETL processes, and data integration. With a minimum of 3-4 years of experience in Ab Initio, you are capable of working independently, possess good communication skills, and have a proven track record on complex, medium to large projects. Your primary skill set includes Ab Initio, proficiency in Co>Operating System 3 and above, and EME concepts. You are well versed in Ab Initio GDE, continuous graphs, and Ab Initio plans. Proficiency in Ab Initio features and functionality is a must, with hands-on experience in Co>Operating System versions 2.15 and 3.x. Additionally, you have good experience in Hadoop and can work on UNIX scripting of medium complexity. Your technical proficiency extends to working knowledge of Oracle and PL/SQL. Scheduler knowledge, especially Control Center, is also part of your skill set. You have experience handling multiple technologies and complex projects, showcasing your multitasking capability; working across multiple modules is a regular part of your role.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

noida, uttar pradesh

On-site

As a highly skilled and experienced Data Modeler, you will be joining the Enterprise Data Modelling team where you will be responsible for creating and maintaining conceptual, logical, and physical data models. Your role will involve ensuring alignment with industry best practices and standards while working closely with business and functional teams. Your contribution as a Data Modeler will be essential in standardizing data models at portfolio and domain levels, thus driving efficiencies and maximizing the value of client's data assets. Preference will be given to candidates with prior experience within an Enterprise Data Modeling team, and ideal domain experience in Insurance or Investment Banking. Your key responsibilities will include developing comprehensive conceptual, logical, and physical data models for multiple domains within the organization by leveraging industry best practices and standards. Collaborating with business and functional teams to understand their data requirements and translating them into effective data models that support their strategic objectives will be crucial. Serving as a subject matter expert in data modeling tools such as ERwin Data Modeler, you will provide guidance and support to other team members and stakeholders. You will also establish and maintain standardized data models across portfolios and domains, ensuring consistency, governance, and alignment with organizational objectives. Identifying opportunities to optimize existing data models and enhance data utilization, particularly in critical areas such as fraud, banking, and AML, will be part of your role. Providing consulting services to internal groups on data modeling tool usage, administration, and issue resolution will promote seamless data flow and application connections. 
Developing and delivering training content and support materials for data models, ensuring that stakeholders have the necessary resources to understand and utilize them effectively, will also be a responsibility. Collaborating with the enterprise data modeling group to develop and implement a robust governance framework and metrics for model standardization, with a focus on long-term automated monitoring solutions, will be essential.

Qualifications:
- Bachelor's or master's degree in Computer Science, Information Systems, or a related field.
- 10+ years of experience working as a Data Modeler or in a similar role, preferably within a large enterprise environment.
- Expertise in data modeling concepts and methodologies, with demonstrated proficiency in creating conceptual, logical, and physical data models.
- Hands-on experience with data modeling tools such as Erwin Data Modeler, as well as proficiency in database environments such as Snowflake and Netezza.
- Strong analytical and problem-solving skills, with the ability to understand complex data requirements and translate them into effective data models.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.

Skills: problem-solving skills, business intelligence platforms, Erwin, data modeling, database management systems, data warehousing, ETL processes, big data technologies, Agile methodologies, data governance, SQL, enterprise data modeling, data visualization tools, cloud data services, analytical skills, data modeling tools, data architecture, communication skills.
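The physical data model mentioned above is the step where entities and relationships become concrete DDL. A hedged sketch of a toy two-entity design from the insurance domain this posting prefers (entity and column names are invented, and SQLite stands in for warehouse platforms like Snowflake or Netezza):

```python
import sqlite3

# Physical model: each policy belongs to exactly one customer,
# a one-to-many relationship enforced by a foreign key.
ddl = """
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE policy (
    policy_id   INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    premium     REAL NOT NULL
);
"""
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK checks off by default
con.executescript(ddl)
con.execute("INSERT INTO customer VALUES (1, 'Acme')")
con.execute("INSERT INTO policy VALUES (10, 1, 250.0)")
tables = {row[0] for row in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
```

The conceptual model ("a customer holds policies") and logical model (entities, attributes, cardinality) precede this; only the physical layer commits to types, keys, and platform-specific syntax.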

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As a highly skilled and experienced Senior Data Engineer at Magna, you will have the opportunity to lead our data engineering & analytics team. Your primary responsibility will be designing, implementing, and managing our data infrastructure and analytical systems. You will play a crucial role in ensuring efficient and accurate data flow using Databricks, data modeling, ETL processes, and database management. In this role, you will lead and manage a team of data engineers, providing technical guidance, mentoring, and support to drive the team's success. Collaborating with cross-functional teams, you will gather requirements, analyze data, and design effective solutions. Additionally, you will be hands-on in developing Python/Spark-based scripts and applications to support data processing and transformation. Your duties will also include optimizing and enhancing existing data infrastructure and systems for performance, scalability, and reliability. You will be involved in DevOps activities, such as deploying jobs and setting up required infrastructure. Staying updated with the latest industry trends and technologies in data engineering and analytics is essential to excel in this role. To be successful in this position, you should hold a degree in computer science or a related field and have 6-10 years of experience in Data Engineering & Analytics. A strong understanding of Data warehousing, engineering & ETL concepts, along with experience in Databricks development using Python, PySpark & DLT Pipelines is required. Proficiency in working with databases, various data storage types, and tools such as Databricks Notebooks, Azure Data Factory, and Change Data Capture (CDC) is expected. Excellent communication and interpersonal skills, time and project management abilities, goal orientation, and holistic thinking are essential qualities for this role. Experience with Azure/AWS and certifications in relevant data technologies will be advantageous. 
You should also possess team-leading, troubleshooting, analytical, client management, communication, and collaboration skills. Join Magna, a global mobility technology company, and be part of a team that is dedicated to innovating and advancing mobility in the transportation landscape. Your career path at Magna will be as unique as you are, with exciting responsibilities and development prospects awaiting you.
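Change Data Capture (CDC), listed among the tools above, boils down to applying an ordered stream of insert/update/delete events to a target table. A minimal pure-Python sketch of that merge logic (the event shape and data are invented; on Databricks this is typically done with MERGE INTO or DLT's APPLY CHANGES rather than by hand):

```python
# Target table keyed by primary key, plus a CDC event feed.
target = {1: {"name": "Ada"}, 2: {"name": "Bob"}}

events = [
    {"op": "update", "key": 2, "row": {"name": "Robert"}},
    {"op": "insert", "key": 3, "row": {"name": "Cleo"}},
    {"op": "delete", "key": 1, "row": None},
]

def apply_cdc(table, feed):
    """Apply insert/update/delete events in order (an upsert-style merge)."""
    for e in feed:
        if e["op"] == "delete":
            table.pop(e["key"], None)
        else:  # insert and update are both upserts keyed on the primary key
            table[e["key"]] = e["row"]
    return table

state = apply_cdc(dict(target), events)
```

Event order matters: replaying the same feed in a different order can yield a different final state, which is why real CDC feeds carry sequence numbers or timestamps.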

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

karnataka

On-site

As a Junior Developer in the Information Technology department at TMF Group, you will be responsible for a variety of tasks related to database management and application development. Your role will involve writing SQL queries, developing SQL databases, and performing CRUD operations. Additionally, you will work closely with other developers to optimize database systems for performance efficiency and contribute to database architecture and data warehousing. One of your key responsibilities will be implementing security measures to protect data, including managing user permissions and safeguarding sensitive information. You will also be involved in identifying, troubleshooting, and resolving database issues to ensure the smooth operation of database systems. Collaboration with team members and stakeholders is essential, as you will contribute to team efforts and support the development of their skills. Furthermore, you will provide technical assistance and training to end-users to promote effective utilization of analytical tools like Qlik Sense, Power BI, and more. Collaboration with cross-functional teams will be necessary to align data analytics initiatives with business goals. Proficiency in SQL for data querying and manipulation is a key requirement for this role, along with a Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Experience with programming languages such as Python and JavaScript is advantageous, along with familiarity with data extraction, transformation, and loading (ETL) processes. Strong problem-solving skills, attention to detail, effective communication skills, and the ability to work collaboratively in a team environment are essential for success in this position. Knowledge of SQL Server Reporting Services, SQL Server Analysis Services, Transparent Data Encryption (TDE), NoSQL/NewSQL databases, and handling large Excel files will be beneficial.
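The CRUD and data-protection duties above go hand in hand: parameterized queries both perform the four operations and keep untrusted input out of the SQL text, a first line of defense against injection. A minimal sketch (schema invented; SQLite stands in for SQL Server):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL)")

# Create: "?" placeholders bind values safely instead of string-formatting them in.
con.execute("INSERT INTO users (id, email) VALUES (?, ?)", (1, "a@example.com"))
# Read
email = con.execute("SELECT email FROM users WHERE id = ?", (1,)).fetchone()[0]
# Update
con.execute("UPDATE users SET email = ? WHERE id = ?", ("b@example.com", 1))
# Delete
con.execute("DELETE FROM users WHERE id = ?", (1,))
remaining = con.execute("SELECT COUNT(*) FROM users").fetchone()[0]
```

The same placeholder discipline applies in any driver; only the placeholder token varies (e.g. `%s` or named parameters on other databases).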
At TMF Group, you will have the opportunity for career development, working on interesting and challenging projects with colleagues and clients worldwide. Internal career advancement opportunities within TMF Group and continuous development through global learning opportunities from the TMF Business Academy will be available to you. Additionally, you will contribute to making the world a simpler place to do business for clients and make a difference in the communities where TMF Group operates through the corporate social responsibility program. The supportive environment at TMF Group includes a strong feedback culture, an inclusive work environment allowing office and remote work flexibility, and various well-being initiatives. Other benefits such as a Marriage Gift policy, Paternity & Adoption leaves, an interest-free loan policy, a salary advance policy, and more are provided to employees. TMF Group is excited to welcome individuals with the talent and potential to thrive in a global company that values diversity and offers opportunities to individuals from various backgrounds.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

You are a highly skilled Alteryx developer responsible for building, optimizing, and maintaining workflows and data transformation processes in Alteryx Designer. Your role involves collaborating with business stakeholders to gather requirements and deliver data-driven solutions. You will be optimizing workflows for performance and scalability, integrating Alteryx workflows with other BI tools for reporting and visualization, and automating data extraction, transformation, and loading processes to streamline business operations. Your responsibilities include designing, developing, and maintaining workflows using Alteryx Designer, ensuring adherence to best practices, and performing data quality checks and troubleshooting issues within Alteryx workflows. You will also provide documentation and training to end-users on Alteryx workflows and solutions, stay updated on the latest features in Alteryx Designer, and recommend improvements to existing workflows. You should have proven experience with Alteryx Designer and developing complex workflows, and a strong understanding of data analytics, ETL processes, and data visualization principles. Proficiency in integrating Alteryx workflows with BI tools like Tableau, Power BI, or QlikView is required. Excellent problem-solving and analytical skills with strong attention to detail, knowledge of relational databases and SQL, strong communication skills, the Alteryx Designer Core Certification (or higher), experience with Python or R, familiarity with cloud platforms for data integration, and knowledge of data governance and compliance standards are essential for this role. If you are interested in this opportunity, please share your resume at namratha.katke@neutrinotechlabs.com.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

hyderabad, telangana

On-site

As an Associate Engineer II at ArcelorMittal Global Business and Technologies in India, you will play a crucial role in handling front end SAP BW projects. Your primary responsibility will be to manage the full lifecycle of solutions, from gathering business requirements to developing, testing, and supporting robust BI applications. You will collaborate closely with business stakeholders to comprehend their analytical needs, convert them into technical specifications, and create effective BW solutions. Your role will require a unique combination of business acumen, analytical skills, and technical expertise in SAP BW/PowerBI/Tableau development. To qualify for this position, you should hold a Bachelor's degree in Information Technology, Computer Science, or a related field, along with 1-2 years of experience in BW development. A strong grasp of SAP BW and Azure concepts, architecture, and data modeling principles is essential, as well as experience in developing SAP BW solutions, including ETL processes. Additionally, you should possess expertise in query design and reporting tools such as BEx Query Designer and Analysis for Office. Strong analytical and problem-solving skills are crucial for translating business requirements into technical solutions effectively. Effective communication and interpersonal skills are also necessary for collaborating with business stakeholders and technical teams. Preferred qualifications include experience with Power BI and Tableau, as well as MS, Tableau, or SAP certification. Familiarity with SAP BW/PowerBI/Tableau security concepts, user authorization, SAP HANA, and BW/4HANA is advantageous. In return, ArcelorMittal Global Business and Technologies offer a key technical role in enterprise data infrastructure, a competitive salary, benefits, continuous learning opportunities, access to modern SAP and Azure environments, and cross-functional collaboration for skill development. 
Join us to be part of a thriving community where innovative ideas are nurtured and sustainable business growth is encouraged.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

bhopal, madhya pradesh

On-site

We are looking for a skilled Database Developer with expertise in MongoDB, SQL, and database design to join our team. As a Database Developer, you will be responsible for designing, implementing, and optimizing MongoDB databases for performance and scalability. You will play a crucial role in creating and managing relational (SQL) and NoSQL (MongoDB) schemas, writing and optimizing complex queries, and migrating data between different systems. Your contribution will be essential in maintaining database integrity, performance, and security to support our data-driven applications.

Key Responsibilities
- Design, implement, and optimize MongoDB databases for performance and scalability.
- Create and manage relational (SQL) and NoSQL (MongoDB) schemas.
- Write and optimize complex queries in both MongoDB and SQL databases.
- Migrate data between SQL and NoSQL systems as required.
- Ensure database integrity, performance, and security.
- Collaborate with development, DevOps, and analytics teams to support data-driven applications.
- Monitor, troubleshoot, and resolve database performance issues.
- Document database structures, processes, and best practices.

Required Skills
- Deep understanding of MongoDB features such as collections, indexes, aggregation pipelines, and performance tuning.
- Strong command of SQL syntax and query optimization (e.g., PostgreSQL, MySQL).
- Experience in designing normalized (relational) and denormalized (NoSQL) database models.
- Ability to design and manage data migration and ETL processes.
- Proficiency in version control using Git and collaborative development practices.
- Strong analytical skills, problem-solving abilities, and attention to detail.

Preferred Qualifications
- Experience with MongoDB Atlas and cloud-based deployments.
- Knowledge of scripting or programming languages like Python, JavaScript/Node.js.
- Understanding of database security, access control, and compliance.
- Familiarity with Agile, CI/CD, or DevOps environments. If you are a motivated Database Developer with a passion for building and maintaining robust database systems, we encourage you to apply for this exciting opportunity to contribute to our team's success.,
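The migration duties in this listing (moving data between SQL and NoSQL systems, turning normalized relational rows into denormalized document models) can be sketched in plain Python. This is only an illustrative shape, not the employer's actual pipeline: the `customers`/`orders` tables and the in-memory SQLite source are assumptions, and a real migration would write the resulting documents to MongoDB (for example via pymongo's `insert_many`) instead of keeping them in a list.

```python
import sqlite3

def rows_to_documents(conn):
    """Join normalized customer/order rows into denormalized,
    MongoDB-style documents (plain dicts), one per customer."""
    cur = conn.execute(
        """
        SELECT c.id, c.name, o.id, o.total
        FROM customers c
        LEFT JOIN orders o ON o.customer_id = c.id
        ORDER BY c.id
        """
    )
    docs = {}
    for cust_id, name, order_id, total in cur:
        # One document per customer; orders are embedded as an array.
        doc = docs.setdefault(cust_id, {"_id": cust_id, "name": name, "orders": []})
        if order_id is not None:  # LEFT JOIN yields NULLs for customers with no orders
            doc["orders"].append({"order_id": order_id, "total": total})
    return list(docs.values())

# Tiny in-memory relational source standing in for the real SQL system.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 90.0);
    """
)
docs = rows_to_documents(conn)
```

The design choice here mirrors a common SQL-to-MongoDB migration pattern: the one-to-many relationship expressed through a foreign key becomes an embedded array, trading join flexibility for single-document reads.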

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As an Analytics and Power BI Specialist, you will play a crucial role in our team by leveraging your expertise in data analysis and visualization to drive business decisions. Your primary responsibility will be collecting, cleaning, and analyzing large datasets to uncover insights that support our business objectives. You will also develop interactive Power BI reports and dashboards that communicate key metrics and performance indicators to stakeholders.

Collaboration with various departments is essential in this role: you will work closely with them to understand their data needs and translate those requirements into technical solutions. Ensuring data integrity and consistency across reports and dashboards through effective data modeling will be a key part of your responsibilities. You will also monitor and optimize Power BI reports for performance and usability, and provide training and support to end users on how to use Power BI tools effectively.

To excel in this position, you should hold a Bachelor's degree in Data Science, Computer Science, Statistics, Business Analytics, or a related field; relevant certifications in Power BI or data analytics are advantageous. A minimum of 5 years of experience in data analysis and reporting is required, along with proven expertise in Power BI, including report creation, data modeling, and dashboard development. Proficiency in Power BI tools such as Power Query, DAX, and Power BI Service is essential, along with a strong understanding of data visualization best practices and techniques. Strong problem-solving skills, excellent communication abilities, attention to detail, and the capacity to work collaboratively in a team environment will contribute to your success in this role.

Preferred skills include experience with human resource management, data warehousing concepts, ETL processes, advanced analytics techniques such as predictive modeling or machine learning, and familiarity with programming languages such as Python or R for data analysis. Your role as an Analytics and Power BI Specialist will be instrumental in driving our business forward through data-driven insights and effective visualization techniques.
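The collect-clean-aggregate workflow this listing describes is tool-agnostic, so its shape can be sketched outside Power BI. The sketch below is a minimal, assumed example: the CSV extract, its column names, and the figures are invented, and the mean-revenue-per-region result stands in for the kind of measure a DAX `AVERAGE` over a data model would produce.

```python
import csv
import io
from statistics import mean

# Invented raw extract; note the missing revenue value that cleaning must handle.
RAW = """region,month,revenue
North,2024-01,1200
North,2024-02,
South,2024-01,950
South,2024-02,1100
"""

def clean_rows(text):
    """Parse the CSV extract, drop rows with missing revenue, coerce types."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        if row["revenue"].strip():
            rows.append({"region": row["region"],
                         "month": row["month"],
                         "revenue": float(row["revenue"])})
    return rows

def revenue_by_region(rows):
    """Aggregate mean revenue per region -- the KPI the dashboard would surface."""
    by_region = {}
    for r in rows:
        by_region.setdefault(r["region"], []).append(r["revenue"])
    return {region: mean(values) for region, values in by_region.items()}

kpis = revenue_by_region(clean_rows(RAW))
```

In a Power BI pipeline the cleaning step would typically live in Power Query and the aggregation in a DAX measure; doing both in code simply makes the data-integrity decisions (here, dropping incomplete rows) explicit and reviewable.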

Posted 2 weeks ago

Apply