
1055 ETL Processes Jobs - Page 25

Set up a job alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer, you will be responsible for building and maintaining data pipelines using Snowflake, SQL, and Python to establish a robust data platform for our team. Your role will involve collaborating with data analysts and data scientists to ensure data accuracy and security while optimizing data systems for performance. Your key responsibilities will include constructing data pipelines and ETL processes, working with Snowflake for data storage and optimization, writing and enhancing SQL queries, leveraging Python for data tasks, and monitoring data systems to uphold their efficiency. Your expertise in Snowflake, advanced SQL skills, proficiency in Python for data engineering, experience with ETL processes, and understanding of data modeling will be essential for this role. To qualify for this position, you must hold a Bachelor's degree in Computer Science or a related field, possess over 6 years of data engineering experience, demonstrate strong problem-solving and communication skills, exhibit the ability to collaborate effectively within a team, and ideally have experience in AI/ML. Additionally, SnowPro certification would be a plus. If you are passionate about data engineering and possess the required skills and qualifications, we invite you to join our team and contribute to the development of our data platform.
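
For illustration only (not part of the posting): a minimal sketch of the kind of Snowflake load step this role describes, written in Python with the snowflake-connector-python package. The account, file, table, and stage names are hypothetical.

```python
# Illustrative sketch only: load a CSV into a Snowflake staging table and merge
# it into a reporting table. Connection parameters and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Stage a local CSV file on the table stage and copy it into the staging table.
    cur.execute("PUT file://daily_orders.csv @%ORDERS_STAGING AUTO_COMPRESS=TRUE")
    cur.execute("""
        COPY INTO ORDERS_STAGING
        FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
    """)
    # Simple transform step: upsert into the reporting table.
    cur.execute("""
        MERGE INTO ORDERS AS t
        USING ORDERS_STAGING AS s
        ON t.ORDER_ID = s.ORDER_ID
        WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, AMOUNT) VALUES (s.ORDER_ID, s.AMOUNT)
    """)
finally:
    conn.close()
```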

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

KLA is a global leader in diversified electronics for the semiconductor manufacturing ecosystem. Virtually every electronic device in the world is produced using our technologies. No laptop, smartphone, wearable device, voice-controlled gadget, flexible screen, VR device or smart car would have made it into your hands without us. KLA invents systems and solutions for the manufacturing of wafers and reticles, integrated circuits, packaging, printed circuit boards and flat panel displays. The innovative ideas and devices that are advancing humanity all begin with inspiration, research and development. KLA focuses more than most on innovation, investing 15% of sales back into R&D. Our expert teams of physicists, engineers, data scientists and problem-solvers work together with the world's leading technology providers to accelerate the delivery of tomorrow's electronic devices. Life here is exciting and our teams thrive on tackling really hard problems. There is never a dull moment with us.

The Information Technology (IT) group at KLA is involved in every aspect of the global business. IT's mission is to enable business growth and productivity by connecting people, process, and technology. It focuses not only on enhancing the technology that enables our business to thrive but also on how employees use and are empowered by technology. This integrated approach to customer service, creativity and technological excellence enables employee productivity, business analytics, and process excellence.

As a Sr. Data Engineer on the Data Sciences and Analytics team, you will play a key role in KLA's data strategy principles and techniques. As part of the centralized analytics team, you will help analyze and surface key data insights into various business unit processes across the company. You will provide key performance indicators and dashboards that help business users and partners make business-critical decisions. You will craft and develop analytical solutions by capturing business requirements and translating them into technical specifications, building data models and data visualizations.

Responsibilities:
- Design, develop and deploy Microsoft Fabric solutions, Power BI reports and dashboards.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Develop data models and establish data connections to various data sources, applying expert knowledge of Microsoft Fabric architecture, deployment, and management.
- Optimize Power BI solutions for performance and scalability.
- Implement best practices for data visualization and user experience.
- Conduct code reviews and provide mentorship to junior developers.
- Manage permissions and workspaces in Power BI, ensuring a secure and efficient analytics platform.
- Conduct assessments and audits of existing Microsoft Fabric environments to identify areas for improvement.
- Stay current with the latest Fabric and Power BI features and updates.
- Troubleshoot and resolve issues related to Fabric objects, Power BI reports and data sources.
- Create detailed documentation, including design specifications, implementation plans, and user guides.

Minimum Qualifications:
- Doctorate (Academic) Degree and 0 years of related work experience; or Master's Level Degree and 3 years of related work experience; or Bachelor's Level Degree and 5 years of related work experience.
- Proven experience as a Power BI Developer, with a strong portfolio of Power BI projects.
- In-depth knowledge of Power BI, including DAX, Power Query, and data modeling.
- Experience with SQL and other data manipulation languages.
- In-depth knowledge of Microsoft Fabric and Power BI, including their components and capabilities.
- Strong understanding of Azure cloud computing, data integration, and data management.
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Excellent technical problem-solving and performance optimization skills.
- Specialist in SQL and stored procedures, with data warehouse concepts.
- Experience performing ETL (Extract, Transform, Load) processes.
- Exceptional communication and interpersonal skills.
- Expert knowledge of cloud and big data concepts and tools: Azure, AWS, Data Lake, Snowflake, etc.

Nice to have:
- Extremely strong SQL skills.
- Foundational knowledge of metadata management, master data management, data governance, and data analytics.
- Technical knowledge of Databricks, Data Lake, Spark, and SQL.
- Experience configuring SSO (Single Sign-On), RBAC, and security roles on an analytics platform.
- SAP functional knowledge is a plus.
- Microsoft certifications related to Microsoft Fabric, Power BI, Azure, or analytics are a plus.
- Good understanding of requirements and converting them into data warehouse solutions.

We offer a competitive, family friendly total rewards package. We design our programs to reflect our commitment to an inclusive environment, while ensuring we provide benefits that meet the diverse needs of our employees. KLA is proud to be an equal opportunity employer.

Be aware of potentially fraudulent job postings or suspicious recruiting activity by persons posing as KLA employees. KLA never asks for any financial compensation to be considered for an interview, to become an employee, or for equipment. Further, KLA does not work with any recruiters or third parties who charge such fees either directly or on behalf of KLA. Please ensure that you have searched KLA's Careers website for legitimate job postings. KLA follows a recruiting process that involves multiple interviews in person or via video conferencing with our hiring managers. If you are concerned that a communication, an interview, an offer of employment, or an employee is not legitimate, please send an email to talent.acquisition@kla.com to confirm that the person you are communicating with is an employee. We take your privacy very seriously and handle your information confidentially.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

We are seeking a talented and driven Power BI Developer to join our team. The ideal candidate will be responsible for collecting, analysing, and interpreting complex data sets to drive informed business decisions. You will work closely and directly with the client and cross-functional teams to identify trends, patterns, and insights that will contribute to our company's growth. In this role, you will play a key part in developing, designing, and maintaining Power BI dashboards and reports to provide actionable insights. You will collaborate with business stakeholders to understand their data requirements and translate them into technical specifications. Additionally, you will implement data models, data transformations, and data visualizations using Power BI. The ideal candidate should have a minimum of 5 years of experience in Power BI development. You will be required to automate data extraction, transformation, and loading (ETL) processes to ensure efficient data flow. Moreover, you will integrate Power BI with other data sources and systems to create comprehensive reporting solutions. You will also be responsible for optimizing Power BI performance and troubleshooting issues as they arise, ensuring data accuracy, consistency, and security in all reports and dashboards.

At Capgemini, you will receive comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support via flexible work arrangements. We are committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. You will have the opportunity to work on cutting-edge projects in tech and engineering with industry leaders, or create solutions to overcome societal and environmental challenges.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world. With over 55 years of heritage, Capgemini is trusted by its clients to unlock the value of technology and address the entire breadth of their business needs. The company delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with deep industry expertise and a strong partner ecosystem.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You are invited to join ValueLabs as a SAS Admin with experience in SAS Viya, based in Bengaluru. With a minimum of 5 years of relevant experience, you will be responsible for the administration of SAS 9.4 and SAS Viya, ensuring smooth operations on Linux/Unix and Windows OS. Your expertise should extend to SAS Grid Manager, SAS Studio, SAS Enterprise Guide, and SAS Visual Analytics. Familiarity with cloud platforms such as AWS, Azure, or GCP is crucial for SAS Viya deployments.

Your role will involve hands-on implementation and configuration of SAS solutions, along with a good understanding of data integration, ETL processes, and database connectivity. You will work closely with stakeholders to gather requirements and translate them into technical solutions. Deployment and configuration of SAS applications, including integration with other systems, will be part of your responsibilities. Monitoring SAS servers, logs, and processes to identify and proactively resolve issues will be key. Troubleshooting performance bottlenecks, errors, and system failures, followed by root cause analysis and implementation of preventive measures, is a crucial aspect of the role.

Ensuring the security and compliance of SAS environments, including implementing encryption, authentication, and authorization mechanisms, will be required, along with regular audits, security patch applications, and adherence to organizational security policies. You will be expected to create and maintain detailed documentation for SAS environments, configurations, and processes. Providing training and support to end-users and team members, and developing best practices and standard operating procedures for SAS administration, will also fall under your responsibilities. If you are ready to take on this challenging role, apply now and be a part of the ValueLabs team.

Posted 1 month ago

Apply

7.0 - 10.0 years

4 - 8 Lacs

Pune, Maharashtra, India

On-site

Responsibilities:
- Data profiling to identify primary keys and issues with the data.
- ETL to bring data onto the Cambia Data Platform, de-duplicate data, create or update dimensional data structures, and produce use case-specific output.
- Unit testing, functional testing, and performance testing and tuning.
- Interacting with the Product team to understand and refine requirements.
- Interacting with QA to address reported findings.
- Working individually and as a team to achieve our goals.
- Taking initiative to take on additional work if the present work stream slows down.
- Other similar or related activities.

Top 3-5 Requirements (must-haves):
- Expert-level knowledge of the Git CLI and managing Git-based repositories.
- Previous CI/CD experience working with GitLab Runners, GitHub Actions, CircleCI, or Jenkins, and configuring them into GitLab repositories.
- Intermediate to expert knowledge of Snowflake-related technologies.
- Intermediate experience in developing and managing Python code and Python-based web services.

Top 3-5 Desired Qualifications:
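
For illustration only (not part of the posting): a minimal sketch of the data-profiling step described above, using pandas to spot primary-key candidates and duplicates. The file and column names are hypothetical.

```python
# Illustrative sketch: basic data profiling with pandas to identify primary-key
# candidates, null rates, and duplicates before de-duplication.
import pandas as pd

df = pd.read_csv("claims_extract.csv")  # hypothetical source extract

# Columns whose values are unique and non-null are primary-key candidates.
pk_candidates = [
    col for col in df.columns
    if df[col].notna().all() and df[col].is_unique
]
print("Primary-key candidates:", pk_candidates)

# Profile null rates and count duplicate rows.
print(df.isna().mean().sort_values(ascending=False).head(10))
print("Duplicate rows:", df.duplicated().sum())

# De-duplicate on the chosen business key, keeping the latest record.
deduped = (
    df.sort_values("updated_at")
      .drop_duplicates(subset=["claim_id"], keep="last")
)
```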

Posted 1 month ago

Apply

3.0 - 10.0 years

4 - 23 Lacs

Gurgaon, Haryana, India

On-site

Description
We are looking for a skilled Data Scientist to join our team in India. The ideal candidate will have 3-10 years of experience in data analysis, machine learning, and statistical modeling, with a passion for turning data into actionable insights.

Responsibilities:
- Analyze large datasets to extract meaningful insights and trends.
- Develop predictive models and machine learning algorithms to improve business outcomes.
- Collaborate with cross-functional teams to understand data requirements and provide data-driven solutions.
- Visualize data findings using tools like Tableau or Power BI for effective communication to stakeholders.
- Stay updated with the latest industry trends and advancements in data science and analytics.

Skills and Qualifications:
- Proficiency in programming languages such as Python or R.
- Experience with data manipulation tools and libraries such as Pandas, NumPy, or SQL.
- Strong understanding of machine learning algorithms and frameworks such as Scikit-learn or TensorFlow.
- Familiarity with data visualization tools like Matplotlib, Seaborn, or Tableau.
- Excellent problem-solving skills and ability to work with complex datasets.
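
For illustration only (not part of the posting): a minimal predictive-modelling sketch with pandas and scikit-learn of the kind this role describes. The dataset, target column, and feature names are hypothetical.

```python
# Illustrative sketch: train a simple classifier and evaluate it on held-out data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

df = pd.read_csv("customers.csv")          # hypothetical dataset
X = df.drop(columns=["churned"])           # feature columns
y = df["churned"]                          # hypothetical binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print("Test AUC:", roc_auc_score(y_test, probs))
```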

Posted 1 month ago

Apply

3.0 - 6.0 years

2 - 20 Lacs

Pune, Maharashtra, India

On-site

Job description

Step into the role of a Data Platform Engineer Lead. At Barclays, innovation isn't just encouraged, it's expected. As a Data Platform Engineer Lead you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Data Platform Engineer Lead, you should have experience with:
- Hands-on coding experience in Java or Python.
- Strong knowledge and hands-on experience of AWS development, including some of the following: Lambda, Glue, Step Functions, IAM roles, Lake Formation, EventBridge, SNS, SQS, EC2, Security Groups, CloudFormation, RDS, DynamoDB, Redshift.
- Experience in building efficient data pipelines using Apache Spark and AWS services.
- Strong technical acumen with the ability to quickly understand complex systems, troubleshoot issues and apply sound engineering principles to solve problems.
- A proactive approach to learning, staying up to date with new technologies and continuously seeking to improve your skills and understanding.
- Design experience across multiple programmes in which you have technically led the development.
- Big data / data warehouse experience, ideally within the financial services domain.

Additional relevant skills that are highly valued include:
- Proven ability to design and develop enterprise-level software solutions using tools and techniques such as source control, build tools (e.g. Maven), TDD, Jenkins, etc.
- Demonstrable working knowledge of different file formats: JSON, Iceberg, Avro.
- Knowledge of streaming services is preferable (Kafka, MSK, Kinesis, Glue Streaming, etc.).
- Communication and collaboration: the ability to communicate effectively with cross-functional teams and stakeholders, documenting configurations, processes, and best practices for the team.
- Experience of mentoring team members.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities
- Build and maintenance of data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Development of processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaboration with data scientists to build and deploy machine learning models.

Vice President Expectations
- Contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures.
- If managing a team, define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short and long term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.
- For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need for the inclusion of other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work, including business-aligned support areas, to keep up to speed with business activity and the business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.
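
For illustration only (not part of the posting): a minimal sketch of the kind of AWS Glue / Apache Spark pipeline step this role calls for. The catalog database, table, and bucket names are hypothetical.

```python
# Illustrative sketch: an AWS Glue job that reads a catalogued table with PySpark,
# applies a simple cleansing step, and writes Parquet to S3.
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database and table).
trades = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="trades"
)

# Basic cleansing with Spark: de-duplicate and drop null amounts.
df = trades.toDF().dropDuplicates(["trade_id"]).filter("amount IS NOT NULL")

# Write the curated output back to S3 as Parquet (hypothetical path).
glue_context.write_dynamic_frame.from_options(
    frame=DynamicFrame.fromDF(df, glue_context, "curated_trades"),
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/trades/"},
    format="parquet",
)

job.commit()
```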

Posted 1 month ago

Apply

2.0 - 6.0 years

2 - 20 Lacs

Pune, Maharashtra, India

On-site

Job description

Join us as a Senior Ab Initio Developer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Senior Ab Initio Developer you should have experience with:
- Ab Initio
- Unix
- SQL
- Hadoop / big data

Some other highly valued skills may include:
- Design
- AWS

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Barclays is required by law to confirm that you have the Legal Right to Work in any role that you apply for. If you currently hold a work visa sponsored by Barclays, or you would require sponsorship from Barclays, you must declare this as part of your application. Sponsored visas are role and entity specific and any changes will be reviewed. It is important that you ensure you are working on the correct visa at all times. Failure to accurately disclose your visa status or Legal Right to Work may result in your application or any employment offer being withdrawn at any time.

Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities
- Build and maintenance of data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Development of processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaboration with data scientists to build and deploy machine learning models.

Analyst Expectations
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.

Posted 1 month ago

Apply

7.0 - 9.0 years

7 - 9 Lacs

Mumbai, Maharashtra, India

On-site

Alteryx Development:
- Solid, deep experience with data extract, transform and load (ETL) tools such as Alteryx.
- Design, develop, and maintain complex Alteryx workflows for data integration, preparation, and analysis.
- Implement best practices in workflow design, optimization, and documentation.
- Create reusable modules and macros to improve efficiency.
- Troubleshoot and resolve issues in existing workflows.
- Perform end-to-end data validation.
- Conduct unit tests and develop database queries to analyze the effects and troubleshoot any issues that arise.
- Write complex SQL queries on multiple tables using complex joins and extract data from the database.

Team Management and Project Management:
- Oversee multiple projects on Alteryx/Power BI/Python/VBA simultaneously.
- Participate in the full project life cycle, including technical analysis, design, data analysis and mapping, development and testing.
- Communicate complex topics to the team through both written and oral communications.
- Mentor junior developers and provide technical guidance.
- Assist in task allocation and workload management within the team.
- Conduct code reviews and ensure adherence to coding standards.
- Manage stakeholder expectations and communicate project status regularly.
- Develop impactful presentations and documents for reporting.

Technical Governance:
- Stay updated with the latest Alteryx features and industry best practices.
- Recommend and implement process improvements and new methodologies.
- Contribute to the development of internal Alteryx standards and guidelines.
- Assist in evaluating new tools and technologies that complement Alteryx.

Stakeholder Interaction:
- Collaborate with business units to gather and clarify requirements.
- Present technical solutions and project updates to stakeholders.
- Provide training and support to end-users on digital solutions.

Data Governance and Quality:
- Ensure compliance with data governance policies in Alteryx workflows.
- Implement data quality checks and validation processes.
- Ensure adherence to the functional architecture strategy and compliance with company development standards.
- Assist in maintaining data security and privacy standards.

Performance Optimization:
- Identify and resolve performance bottlenecks in Alteryx workflows.
- Implement strategies for efficient processing of large datasets.

Documentation and Knowledge Sharing:
- Create and maintain comprehensive documentation for Alteryx workflows.
- Contribute to the team's knowledge base and best practices repository.
- Assist in developing training materials for team members and end-users.
- Participate and contribute in various digital forums.

Required Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related field.
- Alteryx Designer Core or Advanced Certification.
- 7+ years of experience with Alteryx, including advanced workflow development.
- Strong understanding of data integration, ETL processes, and analytics.
- Excellent problem-solving and analytical skills.
- Good communication and interpersonal skills.
- Experience with SQL and at least one programming language (e.g., Python, R).
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience in managing 2 to 3 team members.
- Good to have: knowledge of machine learning and predictive analytics.

Essential:
- Strong communication and analytical skills.
- Self-starter, able to self-manage.
- Ability to prepare accurate reports for all levels of staff in a language and tone appropriate to the audience.
- Good team player, with the ability to work on a local, regional and global basis.
- Able to perform under pressure.
- Drive to succeed and go beyond the extra mile.
- Other programming experience.
- Financial services industry experience.
- General knowledge/skills: databases, Excel, PowerPoint.

Posted 1 month ago

Apply

10.0 - 12.0 years

4 - 8 Lacs

Kolkata, West Bengal, India

On-site

Job Description
Project Role: Data Management Practitioner
Project Role Description: Maintain the quality and compliance of an organization's data assets. Design and implement data strategies, ensuring data integrity and enforcing governance policies. Establish protocols to handle data, safeguard sensitive information, and optimize data usage within the organization. Design and advise on data quality rules and set up effective data compliance policies.
Must have skills: Data Architecture Principles
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: Any graduate

The Data Management Practitioner role falls under the Data Management or Data Governance function.

Key Responsibilities:
- Designing Data Strategies: Developing and implementing strategies that ensure data integrity and compliance, while optimizing its usage across the organization.
- Data Quality & Governance: Designing data quality rules, setting up compliance policies, and enforcing governance frameworks to ensure that data is accurate, secure, and used optimally.
- Team Leadership: As an SME, managing teams, contributing to decision-making processes, and ensuring that all practices align with organizational goals and regulations.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Data Architecture Principles, ensuring that data is structured, organized, and governed effectively.
- Deep understanding of Data Management Best Practices, including how to implement them across different systems within the organization.
- Experience with Data Governance and Compliance Policies, critical for ensuring that data management adheres to regulatory standards.
- Additional Skills: Ability to optimize data usage to derive value for the organization by ensuring the right access, quality, and utility of data across departments.

Required Experience:
- Minimum of 12 years of experience in Data Architecture Principles; this role requires an experienced individual with a deep understanding of how to manage large sets of data and integrate them into business operations.

Educational Requirements:
- A graduate degree in any field, indicating flexibility regarding educational background.

Posted 1 month ago

Apply

8.0 - 9.0 years

4 - 8 Lacs

Pune, Maharashtra, India

On-site

Responsibilities:
- Understand business requirements in the BI context and design data models to transform raw data into meaningful insights.
- Create dashboards and interactive visual reports using Power BI.
- Identify key performance indicators (KPIs) with clear objectives and consistently monitor them.
- Analyze data and present it through reports that support decision-making.
- Convert business requirements into technical specifications and set timelines for delivery.
- Create relationships between data and develop tabular/multidimensional models.
- Document charts, algorithms, parameters, models, and relationships.
- Demonstrate proficiency in Analysis Services, building Tabular and Multidimensional models (OLAP, cubes) over DW/DM/DB.
- Write complex DAX and MDX queries.
- Design and guide BI-related IT architecture in existing or new landscapes, ensuring compliance with standards.
- Provide blueprinting, gather requirements, and roll out BI solutions to end-users.
- Design conceptual, logical, and physical data models.
- Design, develop, test, and deploy Power BI scripts with advanced analytics.
- Analyze current ETL processes and define improvements.
- Contribute to data warehouse development using SSAS, SSIS, and SSRS.
- Redefine and implement strategic improvements to current BI systems.
- Create custom charts and calculations based on user requirements.
- Design and deploy business intelligence solutions that meet evolving organizational needs.
- Use SQL queries, filters, and graphs to enhance data comprehension.
- Collaborate with users and teams at all levels to drive continuous improvement.

Required Skills:
- Bachelor's degree in Computer Science, Business Administration, or a related field.
- Minimum of 6 years of experience in visual reporting and dashboard development.
- At least 6 years of hands-on Power BI development experience.
- Strong expertise in SQL Server.
- Excellent proficiency in Microsoft Office, especially Excel.
- Strong analytical, problem-solving, and organizational skills.
- High attention to detail with the ability to manage multiple tasks and meet deadlines.

If you're ready to lead impactful BI solutions and work with a talented team, we'd love to hear from you!

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

You will be working as a Genio OpenText ETL Developer at Birlasoft, a global leader in Cloud, AI, and Digital technologies. Birlasoft seamlessly combines domain expertise with enterprise solutions to enhance the efficiency and productivity of businesses worldwide. This role, based in India, requires extensive experience in Extract, Transform, Load (ETL) processes using OpenText Genio. Your responsibilities will include designing, developing, and maintaining ETL workflows to support data integration and migration projects.

Your key responsibilities will involve designing, developing, and maintaining ETL processes using OpenText Genio. You will collaborate with business analysts and data architects to understand data requirements and translate them into technical specifications. Implementing data extraction, transformation, and loading processes to integrate data from various sources will be a crucial part of your role. You will also optimize ETL workflows for performance and scalability, perform data quality checks, ensure data integrity throughout the ETL process, troubleshoot and resolve ETL-related issues, document ETL processes, and provide support and guidance to junior ETL developers.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. You must have proven experience as an ETL Developer, with a focus on OpenText Genio. Strong knowledge of ETL concepts, data integration, and data warehousing is essential. Proficiency in SQL, experience with database management systems, familiarity with data modeling and data mapping techniques, excellent problem-solving skills, attention to detail, and strong communication and teamwork abilities are required. Preferred qualifications for this role include experience with other ETL tools and technologies and knowledge of Agile development methodologies.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Job Description:
As a Data Modeler at PwC, you will play a crucial role in analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of data systems. Your expertise in data modeling, metadata management, and data system optimization will contribute to enhancing the overall performance of our data infrastructure.

Key responsibilities include:
- Analyzing and translating business needs into comprehensive data models.
- Evaluating existing data systems and recommending improvements for optimization.
- Defining rules to efficiently translate and transform data across various data models.
- Collaborating with the development team to create conceptual data models and data flows.
- Developing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility and efficiency.
- Implementing data strategies and developing physical data models to meet business requirements.
- Utilizing canonical data modeling techniques to enhance the efficiency of data systems.
- Evaluating implemented data systems for variances, discrepancies, and optimal performance.
- Troubleshooting and optimizing data systems to ensure seamless operation.

Key expertise required:
- Strong proficiency in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, PowerDesigner.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Familiarity with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Experience with ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 3 to 5 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
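
For illustration only (not part of the posting): the dimensional-modelling skills listed above boil down to structures like the tiny star schema below, sketched in Python with SQLite. All table and column names are hypothetical.

```python
# Illustrative sketch: a minimal star schema (one fact table, two dimensions)
# and a typical OLAP-style aggregation query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date    TEXT NOT NULL,
    month        INTEGER NOT NULL,
    year         INTEGER NOT NULL
);

CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_code TEXT NOT NULL,         -- natural/business key
    category     TEXT NOT NULL
);

CREATE TABLE fact_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key  INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity     INTEGER NOT NULL,
    amount       REAL NOT NULL
);
""")

# Aggregate the fact table across both dimensions.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
""").fetchall()
print(rows)
```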

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Power BI Architect with over 10 years of experience, you will be responsible for leveraging your expertise in Power BI to design and implement efficient and scalable solutions. Your key skills and experience will include:
- A minimum of 5 years of hands-on experience working with Power BI, showcasing your proficiency in this tool.
- Expertise as a Power BI Architect, with a deep understanding of Power BI Desktop, Power Query, DAX, and Power BI Service.
- A strong grasp of data warehousing principles, ETL processes, and relational databases such as SQL Server and Azure SQL.
- Familiarity with cloud platforms like Azure Synapse, Azure Data Factory, or similar technologies (an added advantage).
- Solid knowledge of data governance, security, and compliance best practices to ensure data integrity and confidentiality.
- Excellent problem-solving abilities, effective communication skills, and strong leadership qualities to drive successful outcomes.
- A Bachelor's degree in Computer Science, Information Systems, or a related field to support your technical expertise.
- Exposure to the Finance domain, beneficial for understanding industry-specific requirements and challenges.

In this role, you will play a crucial part in designing and implementing Power BI solutions that align with business objectives and drive data-driven decision-making processes. Your contributions will be instrumental in enhancing data visualization, analytics, and reporting capabilities within the organization.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As the heart of our business is driven by technology, we attribute our success to our global and diverse culture. At Quantiphi, we cherish our people and take pride in fostering a culture that thrives on transparency, diversity, integrity, learning, and growth. If the idea of working in an environment that not only encourages you to innovate and excel professionally but also supports your personal growth interests you, then a career with Quantiphi is the right fit for you!

Responsibilities:
- Creating and maintaining LookML code to define data models, dimensions, measures, and relationships within Looker.
- Developing reusable LookML components to ensure consistency and efficiency in report and dashboard creation.
- Building and customizing dashboards to incorporate data visualizations like charts and graphs to effectively present insights.
- Writing complex SQL queries when necessary to extract and manipulate data from underlying databases, and optimizing SQL queries for performance.
- Identifying and addressing bottlenecks affecting report and dashboard loading times, and optimizing Looker performance by tuning queries, caching strategies, and exploring indexing options.
- Configuring user roles and permissions within Looker to control access to sensitive data, and implementing data security best practices, including row-level and field-level security.
- Demonstrating a good understanding of the Looker API, SDK, and extension framework.
- Using version control systems like Git to manage LookML code changes and collaborating with other developers.
- Providing training and support to business users to help them effectively navigate and use Looker.
- Diagnosing and resolving technical issues related to Looker, data models, and reports.

Skills Required:
- Experience in the Looker modeling language, LookML, including defining data models, dimensions, and measures.
- Strong SQL skills for writing and optimizing database queries.
- Understanding of different SQL database systems and dialects (GCP/BQ preferable).
- Knowledge of data modeling best practices.
- Proficiency in ETL processes for data transformation and preparation.
- Skill in creating visually appealing and effective data visualizations using Looker dashboard and reporting tools.
- Ability to optimize Looker performance by fine-tuning queries, caching strategies, and indexing options.
- Familiarity with related tools and technologies such as data warehousing (e.g., BigQuery), data transformation tools (e.g., Apache Spark), and scripting languages (e.g., Python).
- Strong analytical and problem-solving skills.
- Knowledge of data governance principles and practices to ensure data quality, privacy, and compliance within Looker.
- Willingness to stay updated on Looker's latest features and best practices in data analytics and BI.
- Creating and maintaining efficient data models.
- Fine-tuning queries and optimizing the overall performance of Looker dashboards.
- Providing training and support to end-users, helping them understand how to use Looker effectively for data analysis and decision-making.
- Excellent understanding of advanced Looker concepts: Liquid, data security, complex derived tables, caching/PDTs, etc.
- Troubleshooting issues related to data modeling, queries, and dashboards, identifying root causes, and implementing effective solutions to resolve them.

If you are passionate about wild growth and enjoy working with happy, enthusiastic over-achievers, your career at Quantiphi promises to be a fulfilling journey!

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
As a Senior BI Consultant, you will be responsible for supporting and enhancing Business Intelligence and Data Analytics platforms with a primary focus on Power BI and Databricks. You will work across global engagements, helping clients translate complex data into actionable insights. This role involves day-to-day application management, dashboard development, troubleshooting, and stakeholder collaboration to ensure high data quality, performance, and availability.

Your key responsibilities
- BI Support & Monitoring: Provide daily application support for Power BI dashboards and Databricks pipelines, resolving incidents, fulfilling service requests, and implementing enhancements.
- Dashboard Development: Design, develop, and maintain Power BI reports and data models tailored to evolving business requirements.
- Root Cause Analysis: Investigate and resolve data/reporting issues, bugs, and performance bottlenecks through detailed root cause analysis.
- Requirement Gathering: Collaborate with business users and technical stakeholders to define BI requirements and translate them into scalable solutions.
- Documentation: Maintain technical documentation, including data flows, dashboard usage guides, and QA test scripts.
- On-Call & Shift Support: Participate in shift rotations and be available for on-call support for critical business scenarios.
- Integration & Data Modeling: Ensure effective data integration from diverse systems and maintain clean, performant data models within Power BI and Databricks.

Skills and attributes for success
- Hands-on expertise in Power BI, including DAX, data modeling, and report optimization.
- Working experience in Databricks, especially with Delta Lake, SQL, and PySpark for data transformation.
- Familiarity with ETL/ELT design, especially within Azure data ecosystems.
- Ability to troubleshoot BI performance issues and manage service tickets efficiently.
- Strong communication skills to interact with global stakeholders and cross-functional teams.
- Ability to manage and prioritize multiple support tasks in a fast-paced environment.

To qualify for the role, you must have
- 3-7 years of experience in Business Intelligence and Application Support.
- Strong hands-on skills in Power BI and Databricks, preferably in a global delivery model.
- Working knowledge of ETL processes, data validation, and performance tuning.
- Familiarity with ITSM practices for service request, incident, and change management.
- Willingness to work in rotational shifts and support on-call requirements.
- Bachelor's degree in Computer Science, Engineering, or equivalent work experience.
- Willingness to work in a 24x7 rotational shift-based support environment; no location constraints.

Technologies and tools
Must haves:
- Power BI: Expertise in report design, data modeling, and DAX.
- Databricks: Experience with notebooks, Delta Lake, SQL, and PySpark.
- Azure Ecosystem: Familiarity with Azure Data Lake and Azure Synapse (consumer layer).
- ETL & Data Modeling: Good understanding of data integration and modeling best practices.
- ITSM Tools: Experience with ServiceNow or equivalent for ticketing and change management.

Good to have:
- Data Integration: Experience integrating with ERP, CRM, or POS systems.
- Python: For data transformation and automation scripting.
- Monitoring: Awareness of Azure Monitor or Log Analytics for pipeline health.
- Certifications: Microsoft Certified Data Analyst Associate or Databricks Certified Data Engineer Associate.
- Industry Exposure: Experience in retail or consumer goods industries.

What we look for
People with client orientation, experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of hundreds of professionals. Opportunities to work with EY BI application maintenance practices globally, with leading businesses across a range of industries.

What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations - Argentina, China, India, the Philippines, Poland and the UK - and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

About EY
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

If you can demonstrate that you meet the criteria above, please contact us as soon as possible. The exceptional EY experience. It's yours to build.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You are a Senior Python Data Application Developer with strong expertise in core Python and data-focused libraries. Your primary responsibility is to design, develop, and maintain data-driven applications optimized for performance and scalability. You will be building robust data pipelines, ETL processes, and APIs for integrating various data sources efficiently within the cloud environment.

In this role, you will work on AWS using serverless and microservices architectures, utilizing services such as AWS Lambda, API Gateway, S3, DynamoDB, Kinesis, and other AWS tools as required. Collaboration with cross-functional teams is essential to deliver feature-rich applications that meet business requirements. You will apply software design principles and best practices to ensure applications are maintainable, modular, and highly testable. Your tasks will also involve setting up monitoring solutions to proactively monitor application performance, detect anomalies, and resolve issues. Optimizing data applications for cost, performance, and reliability on AWS is a crucial aspect of your role.

To excel in this position, you should have at least 5 years of professional experience in data-focused application development using Python. Proficiency in core Python and data libraries such as Pandas, NumPy, and PySpark is required. You must possess a strong understanding of AWS services like ECS, Lambda, API Gateway, S3, DynamoDB, Kinesis, etc. Experience with building highly distributed and scalable solutions via serverless, microservice, and service-oriented architecture is essential.

Furthermore, you should be familiar with unit test frameworks, code quality tools, and CI/CD practices. Knowledge of database management, ORM concepts, and experience with both relational (PostgreSQL, MySQL) and NoSQL (DynamoDB) databases is desired. An understanding of the end-to-end software development lifecycle, Agile methodology, and AWS certification would be advantageous. Strong problem-solving abilities, attention to detail, critical thinking, and excellent communication skills are necessary for effective collaboration with technical and non-technical teams. Mentoring junior developers and contributing to a collaborative team environment are also part of your responsibilities.

This is a full-time position located in Bangalore with a hybrid work schedule. If you have proficiency in Pandas, NumPy, and PySpark, along with 5 years of experience in Python, we encourage you to apply and join our team dedicated to developing, optimizing, and deploying scalable data applications supporting company growth and innovation.
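
For illustration only (not part of the posting): a minimal sketch of the serverless data task this role describes, an AWS Lambda handler that reads a CSV from S3 with pandas and writes an aggregate to DynamoDB. The event shape, bucket, key, and table names are hypothetical.

```python
# Illustrative sketch: read a CSV from S3, aggregate with pandas, write to DynamoDB.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")


def handler(event, context):
    # Hypothetical event shape; in practice this might come from an S3 trigger.
    bucket = event["bucket"]
    key = event["key"]

    obj = s3.get_object(Bucket=bucket, Key=key)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    daily_totals = df.groupby("customer_id")["amount"].sum()

    table = dynamodb.Table("customer_daily_totals")   # hypothetical table name
    with table.batch_writer() as batch:
        for customer_id, total in daily_totals.items():
            batch.put_item(Item={"customer_id": str(customer_id), "total": str(total)})

    return {"rows_processed": int(len(df))}
```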

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be responsible for fetching and transforming data from various systems, conducting in-depth analyses to identify gaps, opportunities, and insights, and providing recommendations that support strategic business decisions. Your key responsibilities will include data extraction and transformation, data analysis and insight generation, visualization and reporting, collaboration with cross-functional teams, and building strong working relationships with external stakeholders. You will report to the VP of Business Growth and work closely with clients.

To excel in this role, you should have proficiency in SQL for data querying and Python for data manipulation and transformation. Experience with data engineering tools such as Spark and Kafka, as well as orchestration tools like Apache NiFi and Apache Airflow, will be essential for ETL processes and workflow automation. Expertise in data visualization tools such as Tableau and Power BI, along with strong analytical skills including statistical techniques, will be crucial.

In addition to technical skills, you should possess soft skills such as flexibility, excellent communication skills, business acumen, and the ability to work independently as well as within a team. Your academic qualifications should include a Bachelor's or Master's degree in Applied Mathematics, Management Science, Data Science, Statistics, Econometrics, or Engineering. Extensive experience in Data Lake architecture, building data pipelines using AWS services, proficiency in Python and SQL, and experience in the banking domain will be advantageous. Overall, you should demonstrate high motivation, a good work ethic, maturity, personal initiative, and strong oral and written communication skills to succeed in this role.
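
For illustration only (not part of the posting): a minimal sketch of the kind of workflow automation with Apache Airflow mentioned above, wiring an extract step to a transform step. The DAG id, schedule, and task logic are hypothetical placeholders.

```python
# Illustrative sketch: a two-task Airflow DAG for a daily ETL workflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a source system (e.g. via SQL or an API).
    print("extracting source data")


def transform():
    # Placeholder: clean and reshape the extracted data.
    print("transforming data")


with DAG(
    dag_id="daily_reporting_etl",     # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```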

Posted 1 month ago

Apply

1.0 - 5.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You should have at least 5+ years of Quality Assurance experience, along with a minimum of 4+ years of ETL processes / data warehouse testing experience and a minimum of 1+ years of experience in Python (Pandas). Your role will involve hands-on experience with Oracle database technologies, including writing complex Oracle SQL and PL/SQL scripts for data validation and backend data warehouse testing. Experience in ETL processes (data warehouse testing) and BI testing (QlikView, Cognos, etc.) is essential, and you should be able to analyze source systems, the staging area, and fact and dimension tables in the target data warehouse. Familiarity with NoSQL databases is a plus, and you should be proficient in defect tracking / quality assurance tools such as ALM and JIRA, with knowledge of qTest being an advantage.

Strong technical skills, good time management, and the ability to work both in a team and individually are key requirements. You should also have experience with QA processes and deliverables in an Agile/Scrum environment, as well as strong verbal and written communication skills to communicate effectively with varying audiences. Proficiency in MS Office (Word, Excel, Outlook, PowerPoint) and prior experience in the financial domain would be beneficial. Furthermore, you should be well versed in all stages of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).

Your roles and responsibilities will include reviewing requirements and specifications, defining test conditions, designing test cases and test scripts, providing estimates for testing activities, and preparing the test strategy, test plan, detailed test cases, and test scripts. You will be responsible for performing backend (database) testing using complex SQL queries in Oracle, completing regression and integration testing, analyzing test results, logging defects in JIRA, and collaborating with project team members on issue resolution and re-testing. You will also be required to analyze and report test activities and results, and to document, maintain, track, and report test status using tools such as ALM, JIRA, or qTest.
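
For illustration only (not part of the posting): a minimal sketch of a source-vs-target reconciliation check of the kind used in ETL and data-warehouse testing, written with pandas. The connection strings, schemas, and table names are hypothetical; any SQLAlchemy-supported database would work the same way.

```python
# Illustrative sketch: compare row counts and key values between a source extract
# and the target warehouse table.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("oracle+cx_oracle://user:pwd@source-db")   # hypothetical
target = create_engine("oracle+cx_oracle://user:pwd@dwh-db")      # hypothetical

src_df = pd.read_sql("SELECT order_id, amount FROM staging.orders", source)
tgt_df = pd.read_sql("SELECT order_id, amount FROM dw.fact_orders", target)

# Row-count reconciliation.
assert len(src_df) == len(tgt_df), f"Row counts differ: {len(src_df)} vs {len(tgt_df)}"

# Value-level comparison on the business key.
merged = src_df.merge(tgt_df, on="order_id", suffixes=("_src", "_tgt"))
mismatches = merged[merged["amount_src"] != merged["amount_tgt"]]
print("Mismatched rows:", len(mismatches))
```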

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

The Senior Semantic Modeler will be responsible for designing, developing, and maintaining Semantic models using platforms like CubeDev, HoneyDew, AtScale, and others. This role requires a deep understanding of Semantic modeling principles and practices. You will work closely with data architects, data engineers, and business stakeholders to ensure the accurate and efficient representation of data for Generative AI and Business Intelligence purposes. Experience with graph-based semantic models is a plus. As a Product Architect - Semantic Modelling, your key responsibilities will include: - Designing and developing Semantic data models using platforms such as CubeDev, HoneyDew, AtScale, etc. - Creating and maintaining Semantic layers that accurately represent business concepts and support complex querying and reporting. - Collaborating with stakeholders to understand data requirements and translating them into semantic models. - Integrating semantic models with existing Gen AI & BI infrastructure alongside data architects and engineers. - Ensuring the alignment of semantic models with business needs and data governance policies. - Defining key business metrics within the semantic models for consistent and accurate reporting. - Identifying and documenting metric definitions in collaboration with business stakeholders. - Implementing processes for metric validation and verification to ensure accuracy and reliability. - Monitoring and maintaining the performance of metrics within the Semantic models and addressing any issues promptly. - Developing efficient queries and scripts for data retrieval and analysis. - Conducting regular reviews and updates of semantic models to ensure their effectiveness. - Providing guidance and expertise on Semantic technologies and best practices to the development team. - Performing data quality assessments and implementing improvements for data integrity and consistency. - Staying up to date with the latest trends in Semantic technologies and incorporating relevant innovations into the modeling process. - Secondary responsibilities may include designing and developing graph-based semantic models using RDF, OWL, and other semantic web standards. - Creating and maintaining ontologies that accurately represent domain knowledge and business concepts. Requirements: - Bachelor's or Masters degree in Computer Science, Information Systems, Data Science, or a related field. - Minimum of 6+ years of experience in Semantic modeling, data modeling, or related roles. - Proficiency in Semantic modeling platforms such as CubeDev, HoneyDew, AtScale, etc. - Strong understanding of data integration and ETL processes. - Familiarity with data governance and data quality principles. - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Experience with graph-based semantic modeling tools such as Protg, Jena, or similar is a plus. Functional skills: - Experience in Lifesciences commercial analytics industry is preferred with familiarity in industry-specific data standards. - Knowledge of Gen AI overview and frameworks would be a plus. - Certification in BI semantic modeling or related technologies. Trinity is a life science consulting firm, founded in 1996, committed to providing evidence-based solutions for life science corporations globally. 
With over 25 years of experience, Trinity is dedicated to solving clients' most challenging problems through exceptional service, powerful tools, and data-driven insights. Trinity has 12 offices globally, serving 270+ life sciences customers with 1200+ employees. The India office was established in 2017 and currently has more than 350 employees, with plans for exponential growth. Qualifications: B.E. graduates are preferred.
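For the metric validation and verification responsibility listed above, a minimal, platform-agnostic sketch is shown below, assuming pandas and a hypothetical metric definition; it does not use any specific semantic layer's API and simply re-derives a metric from raw rows to compare against the value the model reports.

```python
# Minimal sketch of metric validation: recompute a metric definition from raw
# data with pandas and compare it to the value reported by the semantic layer.
# The metric definition, raw extract, and reported value are all hypothetical.
import pandas as pd

# Hypothetical metric definition mirroring what the semantic model exposes
METRIC = {
    "name": "net_sales",
    "measure": "amount",
    "filter": lambda df: df[df["status"] == "COMPLETED"],
    "agg": "sum",
}

def recompute_metric(raw: pd.DataFrame, metric: dict) -> float:
    """Re-derive the metric directly from raw rows, independent of the model."""
    filtered = metric["filter"](raw)
    return float(filtered[metric["measure"]].agg(metric["agg"]))

def validate(raw: pd.DataFrame, reported_value: float, tolerance: float = 1e-6) -> bool:
    """Check the semantic layer's reported value against the recomputed one."""
    expected = recompute_metric(raw, METRIC)
    return abs(expected - reported_value) <= tolerance

if __name__ == "__main__":
    raw_orders = pd.DataFrame(
        {"amount": [100.0, 50.0, 75.0], "status": ["COMPLETED", "CANCELLED", "COMPLETED"]}
    )
    # Value the semantic layer would return for the same slice (hypothetical)
    assert validate(raw_orders, reported_value=175.0)
    print("net_sales matches the independently recomputed value")
```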

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be responsible for designing, developing, and optimizing data models within the Celonis Execution Management System (EMS). Your duties will include extracting, transforming, and loading (ETL) data from flat files and UDP into Celonis. It is essential to work closely with business stakeholders and data analysts to understand data requirements and ensure an accurate representation of business processes. Additionally, you will develop and optimize PQL (Process Query Language) queries for process mining.

Collaboration with group data engineers, architects, and analysts is crucial to ensure high-quality data pipelines and scalable solutions. Data validation, cleansing, and transformation will also be part of your responsibilities to enhance data quality. Monitoring and troubleshooting data integration pipelines to ensure performance and reliability are key tasks, and you will provide guidance and best practices for data modeling in Celonis.

To qualify for this role, you should have a minimum of 5 years of experience in data engineering, data modeling, or related roles. Proficiency in SQL, ETL processes, and database management (e.g., PostgreSQL, Snowflake, BigQuery, or similar) is required, along with experience working with large-scale datasets and optimizing data models for performance. Your data management experience must cover the data lifecycle and critical functions such as data profiling, data modeling, data engineering, and data consumption products and services. Strong problem-solving skills are necessary, along with the ability to work in an agile, fast-paced environment. Excellent communication skills and demonstrated hands-on experience communicating technical topics to non-technical audiences are expected. You should be able to collaborate effectively with cross-functional teams and manage the timely completion of assigned activities while working in a highly virtual team environment.
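As a rough sketch of the flat-file cleansing and validation step described above, the following assumes pandas and hypothetical event-log column names; the actual load into Celonis (for example via a data connection or the PyCelonis library) is intentionally left out.

```python
# Minimal sketch of cleansing and validating a flat-file extract before it is
# loaded into a process-mining data model. Column names and file paths are
# hypothetical; the load into Celonis EMS itself is out of scope here.
import pandas as pd

REQUIRED_COLUMNS = ["case_id", "activity", "timestamp"]

def load_event_log(path: str) -> pd.DataFrame:
    """Read the extract and fail fast if required columns are missing."""
    df = pd.read_csv(path)
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Extract is missing required columns: {missing}")
    return df

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Parse timestamps, drop invalid events, de-duplicate, and order by case."""
    df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
    df = df.dropna(subset=REQUIRED_COLUMNS)
    df = df.drop_duplicates(subset=["case_id", "activity", "timestamp"])
    # Sort events within each case so downstream process mining sees correct order
    return df.sort_values(["case_id", "timestamp"]).reset_index(drop=True)

if __name__ == "__main__":
    events = cleanse(load_event_log("erp_event_extract.csv"))
    events.to_parquet("staged_event_log.parquet", index=False)
```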

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Data Engineer, you will play a crucial role in designing, implementing, and maintaining robust data pipelines on the Databricks platform. Your primary responsibility will be to ensure the smooth flow of data by collaborating with data scientists, analysts, and stakeholders to guarantee data availability and integrity for analytics and reporting purposes.

Your key responsibilities will include designing and developing data pipelines using Databricks and Apache Spark to create scalable and efficient solutions. You will integrate data from various sources such as databases, APIs, and external datasets, focusing on maintaining data quality and consistency throughout the process. You will also develop and optimize ETL processes to support data analytics and business intelligence needs. Performance optimization will be a critical aspect of the role, covering the tuning of Spark jobs and the management of resource allocation to ensure efficient data processing.

Collaboration will be key, as you will work closely with data scientists, analysts, and stakeholders to understand data requirements and provide effective solutions. Implementing data validation and cleansing procedures will be essential to guarantee data accuracy and reliability. You will document data pipelines, processes, and architecture to support maintenance and knowledge sharing within the team. Finally, you will monitor pipeline performance and reliability and troubleshoot any issues that arise to ensure the smooth functioning of the data infrastructure.
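A minimal PySpark sketch of the kind of ingest-transform-write step described above, as it might run in a Databricks job; the source path, column names, and target table are hypothetical, and Delta Lake is assumed as the storage format.

```python
# Minimal PySpark sketch of an ingest-transform-write step for a Databricks job.
# The landing path, columns, and target table are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Extract: read raw CSV files landed by an upstream system
raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/orders/")
)

# Transform: basic cleansing and derived columns
clean = (
    raw.dropDuplicates(["order_id"])
    .filter(F.col("order_amount").isNotNull())
    .withColumn("order_date", F.to_date("order_timestamp"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Load: write to a Delta table partitioned by date for efficient downstream queries
(
    clean.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders_clean")
)
```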

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

delhi

On-site

You are invited to join our team at ThoughtSol Infotech Pvt. Ltd. as a skilled and experienced Power BI Developer. If you have a passion for data visualization, analytics, and business intelligence, along with a strong background in developing and implementing Power BI solutions, we are looking for you.

Your responsibilities will include developing and maintaining Power BI dashboards, reports, and data visualizations to meet business requirements. You will design and implement data models, ETL processes, and data integration solutions using Power BI and related technologies. Collaborating with business stakeholders to gather requirements, understand data needs, and deliver actionable insights will be a key part of your role. You will also optimize the performance and usability of Power BI solutions through data modeling, query optimization, and UI/UX enhancements, and implement data governance and security best practices to ensure data accuracy, integrity, and confidentiality. Additionally, you will provide training and support to end users on Power BI usage, best practices, and troubleshooting, and stay updated on the latest Power BI features, trends, and best practices in order to recommend improvements to existing solutions.

To qualify for this role, you should have a minimum of 2 years of experience and hold a Bachelor's degree in Computer Science, Information Systems, or a related field. Proficiency in Power BI Desktop, Power Query, DAX, and Power BI Service is required, along with a strong understanding of data warehousing concepts, data modeling techniques, and ETL processes. Experience with SQL, T-SQL, and relational databases (e.g., SQL Server, MySQL, PostgreSQL) is also necessary, and familiarity with Azure services (e.g., Azure SQL Database, Azure Data Lake, Azure Analysis Services) is considered a plus. Excellent analytical, problem-solving, and communication skills are a must, along with the ability to work independently and collaboratively in a fast-paced environment.

Posted 1 month ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Inviting applications for the role of Lead Consultant, Salesforce Developer - Data Cloud. In this role you will be responsible for designing, developing, and implementing solutions primarily using Data Cloud and Salesforce OMS. The ideal candidate is a Salesforce Developer with a strong understanding of Agentforce and hands-on development experience with Data Cloud.

Responsibilities:

Data Cloud Development (Essential):
- Design and implement data pipelines to ingest, transform, and load data into Data Cloud.
- Develop data models and data flows to enable advanced analytics and insights.
- Create data visualizations and dashboards to communicate data-driven insights.
- Integrate machine learning and AI models into Data Cloud to enhance data analysis and prediction.

Agentforce Development (good to have):
- Design, develop, and deploy Agentforce agents to automate tasks and improve customer service efficiency.
- Experience in writing complex, multi-step Prompt Builder prompts and flows.
- Implement complex Agentforce orchestration flows to automate multi-step processes.
- Integrate Agentforce with Data Cloud, OMS, Service Cloud, Experience Cloud, and other relevant systems, both internal Salesforce systems and external platforms.
- Train and support users on Agentforce best practices.
- Optimize Agentforce agents for performance and scalability.
- Monitor Agentforce agents for errors and performance issues and implement corrective actions.

Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience in cloud data engineering or a similar role.
- Strong knowledge of cloud platforms such as AWS, Azure, or Google Cloud.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data modeling, ETL processes, and data warehousing.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
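As a loose illustration of the first Data Cloud responsibility (ingesting transformed data), here is a hedged Python sketch; the instance URL, ingestion endpoint path, field mapping, and authentication handling are placeholders and would need to follow the org's actual Ingestion API connector configuration and OAuth flow.

```python
# Hedged sketch of pushing transformed CRM records toward a Data Cloud ingestion
# endpoint. The endpoint path, source/object names, and token handling are
# placeholders, not a verified Salesforce API contract.
import requests

INSTANCE_URL = "https://example.my.salesforce.com"          # placeholder
INGEST_PATH = "/api/v1/ingest/sources/crm_source/contact"   # placeholder

def to_payload(rows):
    """Shape raw CRM rows into the payload structure expected by the connector."""
    return {"data": [
        {"contact_id": r["Id"], "email": r["Email"], "last_activity": r["LastActivityDate"]}
        for r in rows
    ]}

def push(rows, access_token):
    """POST a batch of transformed records and raise on any HTTP error."""
    resp = requests.post(
        INSTANCE_URL + INGEST_PATH,
        json=to_payload(rows),
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```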

Posted 1 month ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Inviting applications for the role of Lead Consultant - Data Engineer, Azure + Python!

Responsibilities:
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies such as Apache Hadoop and Apache Spark with appropriate cloud-based services such as Amazon AWS.
- Build data pipelines by building ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyze business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyze requirements/user stories in business meetings, assess the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications:
- Experience in designing and implementing data pipelines, building data applications, and performing data migration on AWS.
- Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks will be an added advantage.
- Strong experience in Python and SQL.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills:
- Master's Degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering and Cloud certifications, Databricks certifications.
- Experience working with Oracle ERP.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of Change & Incident Management processes.
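As a rough sketch of a Glue-based ETL step of the kind described above, the following assumes an AWS Glue PySpark job; the catalog database, table, Redshift connection, and S3 bucket names are hypothetical placeholders.

```python
# Minimal sketch of an AWS Glue job that reads a cataloged source table,
# applies a simple transformation, and writes the result to Redshift.
# Database, table, connection, and bucket names are hypothetical.
import sys
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read the source table from the Glue Data Catalog
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Transform: switch to a Spark DataFrame for filtering and de-duplication
df = source.toDF().filter(F.col("order_amount") > 0).dropDuplicates(["order_id"])
clean = DynamicFrame.fromDF(df, glue_context, "clean_orders")

# Load: write to Redshift via a pre-configured Glue connection
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=clean,
    catalog_connection="redshift_conn",
    connection_options={"dbtable": "analytics.orders", "database": "dw"},
    redshift_tmp_dir="s3://example-temp-bucket/glue/",
)
job.commit()
```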

Posted 1 month ago

Apply