
1052 ETL Processes Jobs - Page 18

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal

On-site

The ideal candidate for this position should have proven experience as a Power BI Developer or in a similar role. You should be well-versed in data modeling, ETL processes, and data warehousing, with a good understanding of SQL and other data querying languages. In terms of technical skills, you should be proficient in Microsoft Power BI, including Power Query and DAX. A strong grasp of database management systems (DBMS) and data warehousing concepts is essential. Experience with other BI tools such as Tableau or Qlik would be considered a plus.

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

Job Description: As a BI Developer with 3-8 years of experience, you will work on both offshore and onsite client projects. Your responsibilities will include Oracle BI Applications/OBIEE/OAC/OAS implementation, interacting with clients to gather requirements, and being accountable for technical design, development, and system/integration testing following Oracle methodologies. You should hold a BE, B.Tech, or MCA qualification and have a minimum of 3-8 years of Business Intelligence project experience, including at least one complete lifecycle implementation of BI Applications/OBIEE/OAC/OAS. Moreover, you should possess expertise in end-to-end OBIEE/OAC/OAS and Oracle BI Applications implementation, proficiency in OBIEE/OAC RPD design and report design, strong PL/SQL skills, and a good understanding of Oracle database and development. Additionally, qualities like creativity, personal drive, influencing and negotiating, problem-solving, building effective relationships, customer focus, and effective communication are highly valued. This full-time position is based in Bangalore, Pune, or Hyderabad with onsite/hybrid work options, and the start date is immediate. If you meet the requirements and are interested, please share your updated resume at emily@netsach.co.in and apply at netsachglobal.com. Thank you, Emily Jha, Netsach - A Cyber Security Company, www.netsachglobal.com

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for a highly motivated and skilled Data Engineer to join our dynamic team. In this high-impact role, you will play a key part in constructing and maintaining our data infrastructure to facilitate data-driven decision-making throughout the organization. We welcome remote candidates who are team-oriented, adaptable, and eager to acquire new knowledge. Your responsibilities will include designing, developing, and managing scalable ETL pipelines utilizing Python and Digdag. You will work extensively with Google Cloud services, particularly BigQuery, for data warehousing and analytics. Crafting and optimizing intricate SQL queries for data extraction and transformation will be a crucial part of your role. You will work closely with data scientists, analysts, and fellow engineers to understand data requirements and provide robust data solutions. Ensuring data quality, integrity, and security across all data pipelines and systems will be a top priority. You will troubleshoot and resolve data-related issues with minimal disruption to data availability and continuously explore and implement new technologies and best practices in data engineering. Furthermore, contributing to the overall data strategy and architecture will be a part of your role. To qualify for this position, you should have strong experience in data engineering with a proven history of constructing and deploying data pipelines. Proficiency in Python for data manipulation and automation is essential, along with experience in workflow management tools like Digdag. In-depth knowledge of ETL processes, data warehousing concepts, and extensive experience with Google Cloud Platform (GCP), particularly BigQuery, are required. Expertise in SQL for data querying and manipulation is a must. The ability to work both independently and collaboratively within a team is vital.
Strong problem-solving and analytical skills, as well as excellent communication and interpersonal abilities, are desired qualities. Preferred qualifications include familiarity with other GCP data services such as Cloud Storage, Cloud Pub/Sub, and Dataflow, experience with data visualization tools, and an understanding of data governance and data security principles. This is a full-time position with a day shift and morning shift schedule. Work location is in person.
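To illustrate the kind of work described above, here is a minimal, hedged sketch of a transform step such a pipeline might run before loading into BigQuery. The field names (`order_id`, `amount`) and the table are illustrative, not from the posting; the actual BigQuery load call is only indicated in a comment.

```python
import csv
import io

def transform_orders(raw_csv: str) -> list[dict]:
    """Clean raw CSV rows before loading into a warehouse table.

    Drops rows with a missing order_id and normalizes amounts to float,
    mirroring the kind of transform step a Digdag task would invoke.
    Field names are hypothetical examples.
    """
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        if not rec.get("order_id"):
            continue  # enforce data integrity: skip incomplete records
        rows.append({
            "order_id": rec["order_id"],
            "amount": float(rec["amount"] or 0),  # default missing amounts to 0
        })
    return rows

raw = "order_id,amount\n1001,25.50\n,10.00\n1002,\n"
cleaned = transform_orders(raw)
# The cleaned rows would then be loaded, e.g. via
# google.cloud.bigquery.Client().load_table_from_json(cleaned, table_id)
```

In a Digdag workflow, a step like this would typically be one `py>` task, with the load into BigQuery as the following task.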

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

At Systech Solutions, Inc., a data and analytics firm specializing in cloud, data engineering, AI, and machine learning solutions, you will have the opportunity to leverage your expertise as a Senior Data Engineer - DBT & Snowflake. With over 30 years of industry experience and a global footprint, we focus on empowering businesses to address intricate challenges through the seamless integration of data, business strategy, and technology. Our commitment to a technology-agnostic approach and strategic partnerships with industry leaders enables us to craft tailored solutions across a range of sectors including finance, retail, manufacturing, healthcare, and insurance. By prioritizing efficiency and cost-effectiveness, we have maintained an exceptional customer retention rate exceeding 90% and have been acknowledged by Inc. Magazine for our ability to convert challenges into growth prospects. In this full-time, hybrid role based in Chennai, with the option for some remote work, you will play a pivotal role in designing, implementing, and overseeing data transformation processes utilizing DBT and Snowflake technologies. As a Senior Data Engineer, your responsibilities will encompass the development of data models, execution of ETL processes, and construction of data warehouses. Additionally, you will be tasked with conducting data analytics to address business requirements and collaborating with diverse teams to ensure data integrity and optimize performance. To excel in this role, you should possess a solid background in Data Engineering, Data Modeling, and ETL processes, along with proficiency in Data Warehousing and Data Analytics. A deep understanding of DBT (Data Build Tool) and Snowflake is essential, coupled with exceptional problem-solving abilities and effective communication skills. The role will require you to work autonomously and adapt to a hybrid work environment. 
Experience in managing extensive datasets and streamlining data processes will be advantageous. A Bachelor's degree in Computer Science, Information Technology, or a related field is required, and prior exposure to industries such as finance, retail, manufacturing, healthcare, or insurance will be beneficial.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. We are looking for a BI Senior Consultant to join the EY GDS Team. As part of our EY GDS TechOps team, you will be responsible for providing expert-level business intelligence support with a strong focus on Power BI and Databricks. You will work across various regions for our global clients, helping to design, develop, and maintain insightful and scalable data solutions. You will collaborate closely with cross-functional teams to understand business needs, transform data into actionable insights, and continuously improve reporting solutions to meet dynamic business requirements. This is a fantastic opportunity to be part of a leading firm while playing a key role in its growth. You will work with a high-quality team to support global clients, ensuring data-driven decision-making through best-in-class analytics, automation, and innovation, all within an international and collaborative environment. To qualify for the role, you must have a Bachelor's degree in a related technology field (Computer Science, Engineering, Data Analytics, etc.) or equivalent work experience. You should have 3-7 years of hands-on experience in Business Intelligence, with strong proficiency in Power BI and Databricks; experience working with global clients is preferred. Proven experience in building and supporting BI dashboards, data models, and reports, ensuring minimal disruption to business operations, is required. You should have the ability to analyze user-reported issues, identify root causes, and deliver effective solutions in a timely manner.
Experience collaborating with stakeholders to understand reporting needs, gather requirements, and develop scalable data solutions is essential. A strong grasp of ETL processes, data modeling, and data visualization best practices is necessary. The ability to interpret business needs and translate them into technical solutions that enhance efficiency and data-driven decision-making is crucial. Excellent cross-functional communication skills and experience working in offshore/onshore delivery models are a must. You should have the ability to troubleshoot and resolve data discrepancies, report errors, and performance issues related to BI tools. Being a self-starter with the ability to work independently in fast-paced, time-critical environments is important. Flexibility in managing work hours due to the volatile nature of Application Management work, including the ability to do shifts and be on call for critical business requirements, is required. Ideally, you'll also have experience working with cloud-based data platforms such as Azure, especially in data engineering and analytics contexts. Strong knowledge of data integration from various sources (e.g., CRM, ERP, POS systems, web analytics), with experience in building robust ETL/ELT pipelines, is beneficial. Proficiency in Databricks, including the use of Delta Lake, SQL, and PySpark for data transformation and processing, is a plus. Familiarity with integrating Power BI dashboards with diverse data sources, including cloud storage, data warehouses, and APIs, is an advantage. Experience working in or supporting clients in retail or consumer goods industries is a plus. Certifications such as Microsoft Certified: Data Analyst Associate, Databricks Certified Data Engineer Associate, or similar credentials are a strong advantage.
As a BI Senior Consultant, your responsibilities will include providing day-to-day Application Management support for Business Intelligence and Data Analytics solutions, including handling service requests, incident resolution, enhancements, change management, and problem management. You will lead and coordinate root cause analysis for data/reporting issues, bugs, and performance bottlenecks, implementing corrective actions and improvements as needed. Collaborating with business users and technical stakeholders to gather requirements, understand data needs, and provide advice on Power BI dashboards, data models, and Databricks pipelines will be part of your role. You will develop and maintain comprehensive documentation, including data flow diagrams, dashboard usage guides, and test cases/scripts for quality assurance. We are looking for individuals with client orientation, experience, and enthusiasm to learn new things in this fast-moving environment. This is an opportunity to be part of a market-leading, multi-disciplinary team of hundreds of professionals, with opportunities to work with EY BI application maintenance practices globally, alongside leading businesses across a range of industries. At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments.
Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer support, coaching, and feedback from engaging colleagues; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you. About EY: As a global leader in assurance, tax, transaction, and advisory services, we're using the finance products, expertise, and systems we've developed to build a better working world. That starts with a culture that believes in giving you the training, opportunities, and creative freedom to make things better. Whenever you join, however long you stay, the exceptional EY experience lasts a lifetime. And with a commitment to hiring and developing the most passionate people, we'll make our ambition to be the best employer by 2025 a reality. If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible. Join us in building a better working world. Apply now! EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

Work location: Pune

You will work with a multi-disciplinary squad, crafting, developing, and operating Palantir Foundry based applications for the production & operations business. Our teams use Palantir Foundry and various data engineering technologies for data pipelines and data modelling, and build and operate business-critical data-driven solutions. You will use AI/ML and LLMs to drive business efficiency.

Let me tell you about the role: as an enterprise engineer with experience of Palantir Foundry and data engineering technology, you will work with engineers who bring a diverse set of experiences in developing and maintaining applications across production and operations.

What you will deliver:
- Build and maintain applications on the Palantir Foundry platform.
- Develop and optimize data pipelines and workflows.
- Perform data integration, analysis, and visualization tasks; ensure data quality and integrity.
- Identify and solve issues within the Palantir Foundry environment.
- Collaborate with multi-functional teams to deliver data-driven solutions; provide technical support and training to team members.

What you will need to be successful (experience and qualifications):
- Bachelor's degree in Computer Science, Engineering, or Computer Information Systems, with prior experience in software and platform engineering.
- 3+ years of hands-on Palantir Foundry experience, with an understanding of Ontology, Code Repositories, Pipeline Builder, Workshop, Quiver, and Contour.
- Experience with data integration and ETL processes.
- Knowledge of scripting and programming languages such as Python, Spark, Scala, and SQL.
- Awareness of software engineering practices & standard methodologies for the full SDLC, including coding standards, code reviews, source control management, continuous deployment, testing, and operations.
- Collaboration skills: you should be able to engage and influence others to collect requirements, describe what you're doing, work through problems, and find productive solutions.
- Interpersonal skills for partnering with customers and senior leadership.

About bp: Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Additional Information: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status, or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Are you ready to experiment, learn, and implement? Then this is your place to be. Join us on a new adventure where your expertise in SQL Server, SSIS, ETL processes, and data integration can change the dynamics of our organization and revolutionize the paradigm. We are waiting for you because we stand for selection, not rejection. OptiSol is your answer to a stress-free, balanced lifestyle. We are your home away from home, where careers are nurtured at both ends. Certified as a GREAT PLACE TO WORK for 4 consecutive years, we are known for our culture and believe in open communication and accessible leadership. We celebrate diversity and promote work-life balance with flexible policies, so you can thrive personally and professionally. Core competencies required for this role include proficiency in SQL Server, ETL (SSIS), data migration, client handling, team handling, and excellent communication skills. Solid experience with SQL Server and SSIS is essential for making data moves easy. Additionally, experience in data migration, handling clients, and leading teams, along with strong communication skills, is crucial for keeping everyone on the same page. Key responsibilities include analyzing and profiling data from both source and target systems, designing strategies for data mapping, cleansing, transformation, and validation, implementing ETL processes for data migration using SSIS, and collaborating with tech teams to integrate data into SQL Server efficiently. It is also important to develop and apply strategies to validate data, monitor data quality KPIs, and communicate progress clearly to stakeholders. Applicants are expected to have strong SQL Server knowledge (experience with Oracle databases is a plus), proficiency in SSIS, and familiarity with Power BI. Experience in data analysis, profiling, and quality assurance is desirable, along with strong collaboration skills to work effectively with cross-functional teams and stakeholders.
Join us at OptiSol and explore why we are the perfect fit for each other. Experience life at OptiSol and discover more about our culture on OptiSol's Insta page.
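The data-validation responsibilities described above often boil down to reconciling the source and target after a migration run. The sketch below uses SQLite (from Python's standard library) as a stand-in for the real SQL Server source and target; in an SSIS migration the same row-count and checksum comparisons would typically run as T-SQL against both servers. Table and column names are illustrative.

```python
import sqlite3

def migration_checks(src: sqlite3.Connection, tgt: sqlite3.Connection, table: str) -> dict:
    """Post-migration validation: compare row counts and a column checksum.

    Returns a dict of pass/fail flags that a data-quality KPI report
    could aggregate. The 'amount' column is a hypothetical example.
    """
    q_count = f"SELECT COUNT(*) FROM {table}"
    q_sum = f"SELECT COALESCE(SUM(amount), 0) FROM {table}"
    return {
        "rowcount_match": src.execute(q_count).fetchone()[0] == tgt.execute(q_count).fetchone()[0],
        "checksum_match": src.execute(q_sum).fetchone()[0] == tgt.execute(q_sum).fetchone()[0],
    }

# Simulate a source system and a freshly migrated target
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 5.5)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 5.5)])
result = migration_checks(src, tgt, "orders")
```

A mismatch in either flag would flag the batch for re-profiling before sign-off with stakeholders.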

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a member of the risk and compliance team at PwC, your primary focus will be on maintaining regulatory compliance and managing risks for clients. You will provide valuable advice and solutions to help organizations navigate complex regulatory landscapes and enhance their internal controls effectively. In the realm of enterprise risk management, your role will involve identifying and mitigating potential risks that could impact an organization's operations and objectives. You will play a crucial part in developing business strategies to manage and navigate risks in today's rapidly changing business environment. Joining PwC Acceleration Centers (ACs) presents a unique opportunity to actively support various services, including Advisory, Assurance, Tax, and Business Services. Within our innovative hubs, you will engage in challenging projects and deliver distinctive services to enhance client engagements through quality and innovation. Moreover, you will participate in dynamic training programs designed to enhance your technical and professional skills. As a part of the Enterprise Risk Management team, your responsibilities will include designing and implementing data-driven solutions to enhance decision-making processes. In the role of a Senior Associate, you will be tasked with developing interactive dashboards, creating data models, and collaborating with cross-functional teams to drive strategic initiatives and improve organizational performance. 
Key Responsibilities:
- Design and implement data-driven solutions to support decision-making
- Develop interactive dashboards for visualizing key insights
- Enhance data models to improve performance and usability
- Collaborate with cross-functional teams to align on strategic initiatives
- Analyze data to derive insights that enhance organizational performance
- Utilize various tools and methodologies to solve complex problems
- Ensure the accuracy and integrity of data used in analyses
- Maintain a focus on continuous improvement in data processes

Requirements:
- Bachelor's Degree
- 3 years of relevant experience
- Proficiency in oral and written English

Desired Skills:
- Proficiency in Power BI development and data visualization
- Experience in building and maintaining semantic data models
- Familiarity with data integration and ETL processes
- Effective collaboration with cross-functional teams
- Clear communication of status updates and test results
- Proficiency in SQL for data management and transformation
- Engagement in Agile methodologies and ceremonies

This role presents an exciting opportunity to contribute to risk management and decision-making processes while enhancing organizational performance through data-driven solutions and strategic initiatives.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will be responsible for maintaining ETL processes to guarantee data accuracy, completeness, and integrity during the extraction, transformation, and loading stages. Your tasks will include using scripting knowledge to automate these processes efficiently. Proficiency in tools such as SQL, Python/Java, ETL testing frameworks, Tableau, or Google Looker will be necessary for this role. Your role will involve developing and automating test scripts for ETL workflows and data pipelines to ensure comprehensive database testing coverage. Additionally, you will be expected to identify quality issues and collaborate with ETL developers and data engineers to understand data requirements effectively.
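An automated ETL test script of the kind described above typically asserts completeness, null, and type rules against a batch before it is loaded. This is a minimal, hedged sketch; the field names (`customer_id`, `revenue`) and rules are illustrative, not from the posting.

```python
def validate_batch(records: list[dict]) -> list[str]:
    """Minimal ETL quality checks: nulls, duplicates, and type errors.

    Returns a list of human-readable issues; an empty list means the
    batch passed. In practice these checks would run inside an ETL
    testing framework rather than standalone.
    """
    issues = []
    seen_ids = set()
    for i, rec in enumerate(records):
        if rec.get("customer_id") is None:
            issues.append(f"row {i}: null customer_id")
        elif rec["customer_id"] in seen_ids:
            issues.append(f"row {i}: duplicate customer_id {rec['customer_id']}")
        else:
            seen_ids.add(rec["customer_id"])
        if not isinstance(rec.get("revenue"), (int, float)):
            issues.append(f"row {i}: revenue is not numeric")
    return issues

batch = [
    {"customer_id": 7, "revenue": 120.0},
    {"customer_id": None, "revenue": 30.0},
    {"customer_id": 7, "revenue": "n/a"},
]
problems = validate_batch(batch)
```

Wiring checks like these into the pipeline lets quality issues surface before loading, rather than after downstream reports break.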

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

The Database Administrator (DBA) will manage and optimize databases for Project Quasar's RAN Cutover, ensuring reliable storage and retrieval of data during OSS system migrations. This role supports data integrity and performance for market-by-market cutovers.

Responsibilities:
- Manage and optimize databases for OSS data migration.
- Ensure data integrity and performance during cutover processes.
- Implement database backups, recovery, and security measures.
- Collaborate with data engineers to support ETL pipelines.
- Monitor and tune database performance.
- Document database configurations and processes.

Qualifications:
- Bachelor's degree in Computer Science, IT, or a related field.
- 4+ years of experience as a Database Administrator.
- Expertise in database systems (e.g., SQL Server, Oracle, Snowflake).
- Experience with ETL processes and data migrations.
- Strong problem-solving skills.
- Must be located in India and eligible to work.

Preferred Skills:
- Experience in telecommunications or OSS systems.
- Knowledge of Snowflake or Databricks.
- Familiarity with cloud-based databases.
- Certifications in database administration (e.g., Oracle DBA, Microsoft SQL Server).
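The backup-and-recovery responsibility above can be sketched with Python's stdlib SQLite backup API, which performs an online copy while the source database stays available. SQLite is only a stand-in here; production Oracle/SQL Server/Snowflake backups would use their native tooling (RMAN, `BACKUP DATABASE`, Time Travel, etc.), and the subscriber table is a hypothetical example.

```python
import sqlite3

# Online backup of a live database, sketched with SQLite's backup API.
live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE subscribers (msisdn TEXT PRIMARY KEY)")
live.execute("INSERT INTO subscribers VALUES ('9190000000001')")
live.commit()

backup = sqlite3.connect(":memory:")  # stand-in for a backup file path
with backup:
    live.backup(backup)  # copies pages while the source stays available

# Recovery check: the backup copy must be independently queryable
restored = backup.execute("SELECT COUNT(*) FROM subscribers").fetchone()[0]
```

Verifying that the backup restores and queries correctly, as the last line does, is the part of the backup routine most often skipped and most often regretted during a cutover.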

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Data Engineer

Company Overview: Capgemini Engineering is a forward-thinking organization dedicated to leveraging data-driven insights to propel our business strategies. We are currently seeking a motivated Junior Data Engineer with proficiency in Python and a keen interest in working with Cognite Data Fusion to join our dynamic team.

Position Overview: As a Junior Data Engineer, you will be instrumental in designing, developing, and maintaining scalable data pipelines and infrastructure. Your primary focus will be on integrating and optimizing data workflows using Cognite Data Fusion and Python, thereby facilitating seamless data accessibility and analysis across the organization.

Key Responsibilities:
- Data Pipeline Development: Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
- Data Integration: Utilize Cognite Data Fusion to automate and scale the contextualization of various data sources, ensuring efficient data integration and accessibility.
- Programming: Develop robust ETL processes and data workflows using Python, ensuring code quality, scalability, and maintainability.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver effective solutions.
- Data Quality Assurance: Implement data validation and quality checks to ensure accuracy, consistency, and reliability of data.
- Documentation: Maintain comprehensive documentation of data processes, workflows, and system architectures.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- Proficiency in the Python programming language.
- Familiarity with data integration platforms.
- Understanding of data modelling, database design, and data warehousing concepts.
- Experience with SQL and working with relational databases.
- Basic knowledge of cloud platforms such as AWS or Azure is a plus.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Experience with data pipeline and workflow management tools.
- Knowledge of big data technologies and frameworks.
- Familiarity with data visualization tools and techniques (Grafana is a plus).

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description: Tecnotree Corporation, headquartered in Finland and founded in 1978, is the leading provider of full-stack Digital BSS for CSPs and DSPs. Tecnotree supports customers in monetizing and transforming their businesses towards a marketplace of digital services, allowing people to self-serve, engage, and manage their digital lives. Tecnotree is listed on Nasdaq Helsinki (TEM1V). For more information, please visit www.tecnotree.com or contact us at [HIDDEN TEXT].

Role Description: This is a full-time hybrid role, based in Bengaluru with some work-from-home flexibility, for a Data Engineer. The Data Engineer will handle day-to-day tasks such as data engineering, data modeling, ETL processes, data warehousing, and data analytics. The individual will also develop and maintain data pipelines and work closely with other teams to ensure data integrity and accuracy.

Qualifications:
- Proficiency in Data Engineering and Data Modeling
- Experience with Extract Transform Load (ETL) processes and Data Warehousing
- Strong skills in Data Analytics
- Excellent problem-solving and analytical abilities
- Effective communication and teamwork skills
- Bachelor's degree in Computer Science, Engineering, or a related field
- Prior experience in a similar role is advantageous

Posted 1 month ago

Apply

5.0 - 9.0 years

2 - 7 Lacs

Hyderabad, Telangana, India

On-site

Roles & Responsibilities:
- Work with the Reference Data Product Owner, external resources, and other engineers as part of the product team.
- Develop and maintain semantically appropriate concepts.
- Identify and address conceptual gaps in both content and taxonomy.
- Maintain ontology source vocabularies for new or edited codes.
- Support product teams to help them leverage taxonomic solutions.
- Analyze data from public/internal datasets.
- Develop a data model/schema for the taxonomy.
- Create a taxonomy in the Semaphore Ontology Editor.
- Bulk-import data templates into Semaphore to add/update terms in taxonomies.
- Prepare SPARQL queries to generate ad hoc reports.
- Perform gap analysis on current and updated data.
- Maintain taxonomies in Semaphore through the change management process.
- Develop and optimize automated data ingestion pipelines through Python/PySpark when APIs are available.
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Identify and resolve complex data-related challenges.
- Participate in sprint planning meetings and provide estimations on technical implementation.

Basic Qualifications and Experience: Any degree with 5-9 years of experience in Business, Engineering, IT, or a related field.

Functional Skills:

Must-Have Skills:
- Knowledge of controlled vocabularies, classification, ontology, and taxonomy
- Experience in ontology development using Progress Semaphore or a similar tool such as PoolParty
- Hands-on experience writing SPARQL queries on graph data
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Strong understanding of data modeling, data warehousing, and data integration concepts

Good-to-Have Skills:
- Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
- Experience using cloud services such as AWS, Azure, or GCP
- Experience working in a product-team environment
- Knowledge of Python/R, Databricks, and cloud data platforms
- Knowledge of NLP (Natural Language Processing) and AI (Artificial Intelligence) for extracting and standardizing controlled vocabularies
- Strong understanding of data governance frameworks, tools, and best practices

Professional Certifications:
- Databricks Certificate preferred
- Progress Semaphore
- SAFe Practitioner Certificate preferred
- Any data analysis certification (SQL, Python)
- Any cloud certification (AWS or Azure)

Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.
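The ad hoc SPARQL reporting mentioned above usually means queries over a SKOS-style vocabulary. As a hedged sketch, the helper below builds a query listing each concept with its broader term; the scheme URI is illustrative, and a real taxonomy in Semaphore would expose its own scheme and label properties.

```python
def broader_terms_query(scheme_uri: str, limit: int = 100) -> str:
    """Build an ad hoc SPARQL report query over a SKOS taxonomy.

    Lists each concept in the given scheme with its preferred label and
    (optionally) its broader term. The scheme URI is a hypothetical
    example, not a real endpoint.
    """
    return (
        "PREFIX skos: <http://www.w3.org/2004/02/skos/core#>\n"
        "SELECT ?concept ?label ?broader WHERE {\n"
        f"  ?concept skos:inScheme <{scheme_uri}> ;\n"
        "           skos:prefLabel ?label .\n"
        "  OPTIONAL { ?concept skos:broader ?broader }\n"
        f"}} LIMIT {limit}"
    )

q = broader_terms_query("http://example.org/taxonomy/products", limit=50)
```

A query string like `q` would then be submitted to the triple store's SPARQL endpoint (or a library such as rdflib) to produce the report; the `OPTIONAL` clause keeps top-level concepts, which have no broader term, in the results.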

Posted 1 month ago

Apply

5.0 - 9.0 years

5 - 20 Lacs

Hyderabad, Telangana, India

On-site

Roles & Responsibilities: Responsible for cataloging all external data assets, including the harmonization of metadata to increase reuse and inform future data acquisitions. Co-develop and maintain the process to consistently capture the external data purchase forecast, focusing on generating the required metadata to support KPIs and reporting. Responsible for working with Global Strategic Sourcing and Cyber Security teams to standardize data contracts to enable the reuse of data assets across functions. In partnership with functional data SMEs, develop internal expertise on the content of external data to increase reuse across teams. This includes, but is not limited to, participating in data seminars to bring together data SMEs from all functions to increase data literacy. In partnership with the Data Engineering team, design data standardization rules to make external data FAIR from the start, and maintain data quality. In partnership with the Data Privacy and Policy team, develop and operationalize data access controls to adhere to the terms of the data contracts, ensuring access controls, compliance, and security requirements are enforced. Maintain policies and ensure compliance with data privacy, security, and contractual policies. Publish metrics to measure the effectiveness of data reuse, data literacy, and reduction in data spend. Functional Skills: Must-Have Skills: Experience managing external data assets used in the life-science industry (e.g., Claims, EHR, etc.). Experience working with data providers, supporting negotiations and vendor management activities. Technical data management skills with in-depth knowledge of pharma data standards and regulations. Aware of industry trends and priorities and able to apply them to governance and policies. Experience with the data product development life cycle, including the enablement of data dictionaries and a business glossary to increase data product reusability and data literacy.
Good-to-Have Skills: Ability to execute complex projects in a fast-paced environment and manage multiple priorities effectively. Ability to manage projects or departmental budgets. Experience with modelling tools (e.g., Visio). Basic programming skills, experience in data visualization and data modeling tools. Experience working with agile development methodologies such as Scaled Agile. Soft Skills: Ability to build business relationships and understand end-to-end data use and needs. Excellent interpersonal skills (team player). People management skills, either in a matrix or direct line function. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Good presentation and public speaking skills. Strong attention to detail, quality, time management, and customer focus. Basic Qualifications: Any degree with 5-9 years of experience in Business, Engineering, IT, or a related field.

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 7 Lacs

Mumbai, Maharashtra, India

On-site

Must-Have Skills : - Proficiency in SQL and Python/R for data manipulation and analysis. - Experience with Machine Learning (ML) and NLP techniques. - Strong understanding of EDA and statistical modelling. - Hands-on experience with Google Cloud Platform (GCP) and Vertex AI. - Excellent communication and presentation skills. - Solid grasp of data engineering principles, including ETL processes.

Posted 1 month ago

Apply

10.0 - 15.0 years

5 - 7 Lacs

Noida, Uttar Pradesh, India

On-site

Key Responsibilities: Data Modeling: Develop conceptual, logical, and physical data models for various domains within the organization using industry best practices and standards. Collaboration with Stakeholders: Work closely with business and functional teams to translate their data requirements into effective data models. Subject Matter Expert: Serve as a subject matter expert in data modeling tools (e.g., ERwin Data Modeler), providing guidance to other team members and stakeholders. Standardization: Establish and maintain standardized data models across portfolios and domains, ensuring consistency and governance. Model Optimization: Identify and implement opportunities to optimize existing data models, especially in critical areas such as fraud, banking, and AML (anti-money laundering). Consulting Services: Provide consulting on data modeling tool usage and administration, ensuring seamless data flow and application connections. Training and Support: Develop and deliver training content and support materials to help stakeholders understand and utilize data models effectively. Governance Framework: Collaborate with the enterprise data modeling group to develop and implement a robust governance framework and metrics for model standardization, focusing on long-term automated monitoring solutions. Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: At least 10 years of experience as a Data Modeler or in a similar role, preferably within a large enterprise environment. Prior experience in Enterprise Data Modeling. Domain experience in Insurance or Investment Banking is preferred. Technical Skills: Expertise in data modeling concepts and methodologies. Proficiency with ERwin Data Modeler and other data modeling tools. Hands-on experience with database environments like Snowflake and Netezza. Strong SQL skills.
Experience in data warehousing, ETL processes, and big data technologies. Familiarity with cloud data services. Soft Skills: Strong analytical and problem-solving skills. Excellent communication and collaboration skills. Ability to work effectively with cross-functional teams and stakeholders.
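The dimensional-modeling skills listed above can be illustrated with a minimal star schema. In this sketch, sqlite3 stands in for a warehouse engine such as Snowflake or Netezza, and all table and column names are invented for the example.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, segment TEXT);
    CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id INTEGER REFERENCES dim_date(date_id),
        amount REAL
    );
""")
cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "Enterprise"), (2, "Globex", "SMB")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(10, "2024-01-15", 2024), (11, "2024-02-01", 2024)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(100, 1, 10, 250.0), (101, 1, 11, 100.0), (102, 2, 10, 75.0)])

# Typical analytical query: join the fact table to a dimension and aggregate.
rows = cur.execute("""
    SELECT c.segment, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    GROUP BY c.segment
    ORDER BY c.segment
""").fetchall()
print(rows)  # [('Enterprise', 350.0), ('SMB', 75.0)]
```

The same shape generalizes: a snowflake schema would further normalize the dimensions (e.g., a separate `segment` table referenced from `dim_customer`).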

Posted 1 month ago

Apply

15.0 - 25.0 years

55 - 70 Lacs

Bengaluru, Karnataka, India

On-site

Job Summary: We are looking for a seasoned AI & Data Science Architect to lead and support the Bank's strategic initiatives in AI and Data Science. This role requires a visionary technologist with hands-on experience in designing scalable AI/ML systems, translating complex business challenges into AI-driven solutions, and driving MLOps best practices. The ideal candidate will collaborate across business and technical teams to ensure robust data pipelines, model lifecycle governance, infrastructure scalability, and responsible AI adoption. Key Responsibilities: Architect end-to-end AI and Data Science solutions that ensure scalability, performance, and security. Design robust AI/ML infrastructures, incorporating data pipelines, model training, deployment, and monitoring frameworks. Translate business needs into AI strategies and corresponding technical solutions. Lead the model lifecycle including design, development, deployment, and monitoring with full MLOps integration. Champion MLOps practices such as CI/CD for models, reproducibility, versioning, and model governance. Collaborate with cross-functional teams, including data engineers, ML practitioners, and domain experts, to align AI initiatives with enterprise objectives. Ensure all AI solutions comply with regulatory, ethical, and security standards, including explainability and data privacy. Skills and Qualifications: 15-25 years of experience in AI, data science, or a related field. Proficiency in programming languages such as Python, R, and Java. Extensive experience with machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). Strong understanding of data engineering concepts, data warehousing, and ETL processes. Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their AI services. Knowledge of big data technologies (e.g., Hadoop, Spark) and data visualization tools (e.g., Tableau, Power BI).
Proven ability to work with large datasets and derive actionable insights from them. Excellent problem-solving skills and a strong analytical mindset. Effective communication skills to convey complex technical concepts to non-technical stakeholders. Leadership & Project Management: Team Leadership: Ability to mentor and lead multidisciplinary AI/DS teams. Project Execution: Skilled in estimating, planning, and executing technical projects in enterprise environments. Stakeholder Engagement: Strong communication skills to translate business objectives into technical deliverables and act as a liaison between technical teams and business stakeholders.

Posted 1 month ago

Apply

10.0 - 12.0 years

5 - 7 Lacs

Delhi, India

On-site

Key Responsibilities: Data Architecture: Lead the design, development, and implementation of comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms. Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance. Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems. Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models to support analytical and operational needs. Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation. Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions. Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed. Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications. Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI. Software Development Practices: Apply professional software development principles and best practices to data solution delivery. Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes. 
Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively. Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture. Position Requirements: Strong experience with data transformation & ETL on large datasets. Experience with designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale). 5+ years of experience in Data Modeling (e.g., Relational, Dimensional, Columnar, Big Data). 5+ years of experience with complex SQL or NoSQL queries. Extensive experience in advanced Data Warehouse concepts. Proven experience with industry ETL tools (e.g., Informatica, Unifi). Solid experience in Business Requirements definition, structured analysis, process design, and use case documentation. Experience with Reporting Technologies (e.g., Tableau, PowerBI). Demonstrated experience in professional software development. Exceptional organizational skills with the ability to manage multiple simultaneous customer projects. Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes. Must be self-managed, proactive, and customer-focused. Technical Skills: Cloud Platforms: Microsoft Azure Data Warehousing: Snowflake ETL Methodologies: Extensive experience in ETL processes and tools Data Transformation: Large-scale data transformation Data Modeling: Relational, Dimensional, Columnar, Big Data Query Languages: Complex SQL, NoSQL ETL Tools: Informatica, Unifi (or similar enterprise-grade tools) Reporting & BI: Tableau, PowerBI
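The query optimization responsibility above can be sketched with SQLite's `EXPLAIN QUERY PLAN`. This is an illustration only: the schema is invented, and a production engine (Snowflake, etc.) exposes its own plan tooling, but the workflow of inspecting a plan, adding an index, and re-checking is the same.

```python
import sqlite3

# Illustrative schema; sqlite3 stands in for the production engine.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 100, float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index.
    return " ".join(row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan
cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # indexed search
print(before)
print(after)
```

On a 1,000-row table the difference is negligible, but the plan output shows the shape of the fix that matters at warehouse scale.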

Posted 1 month ago

Apply

12.0 - 17.0 years

2 - 6 Lacs

Hyderabad, Telangana, India

On-site

Roles & Responsibilities: Architect and implement scalable, high-performance, enterprise-scale modern data platforms and applications that include data analysis, data ingestion, storage, data transformation (data pipelines), and analytics. Evaluate new trends and features in the data platforms area and build rapid prototypes. Build data solution architectures and frameworks to accelerate data engineering processes. Build frameworks to improve re-usability and reduce the development time and cost of data management and governance. Integrate AI into data engineering practices to bring efficiency through automation. Build best practices in data platforms capability and ensure their adoption across the product teams. Build and nurture strong relationships with stakeholders, emphasizing value-focused engagement and partnership to align data initiatives with broader business goals. Lead and motivate a high-performing data platforms team to deliver exceptional results. Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and best practices. Collaborate with counterparts in the US and EU and work with business functions, functional IT teams, and others to understand their data needs and ensure the solutions meet the requirements. Engage with business stakeholders to understand their needs and priorities, ensuring that the data and analytics solutions built deliver real value and meet business objectives. Drive adoption of the data and analytics platforms and solutions by partnering with business stakeholders and functional IT teams in rolling out change management, trainings, communications, etc. Talent Growth & People Leadership: Lead, mentor, and manage a high-performing team of engineers, fostering an environment that encourages learning, collaboration, and innovation. Focus on nurturing future leaders and providing growth opportunities through coaching, training, and mentorship.
Recruitment & Team Expansion: Develop a comprehensive talent strategy that includes recruitment, retention, onboarding, and career development, and build a diverse and inclusive team that drives innovation, aligns with Amgen's culture and values, and delivers business priorities. Organizational Leadership: Work closely with senior leaders within the function and across the Amgen India site to align engineering goals with broader organizational objectives and demonstrate leadership by contributing to strategic discussions. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The [vital attribute] professional we seek is a [type of person] with these qualifications. Basic Qualifications: 12 to 17 years of experience; computer science and engineering preferred, other engineering fields will be considered. 10+ years of experience in building data platforms and data engineering, working in COE development or product building. 5+ years of hands-on experience working with big data platforms and solutions using AWS and Databricks. 5+ years of experience leading enterprise-scale data engineering solution development. Experience building enterprise-scale data lake and data fabric solutions on cloud, leveraging modern approaches like Data Mesh. Demonstrated proficiency in leveraging cloud platforms (AWS, Azure, GCP) for data engineering solutions. Strong understanding of cloud architecture principles and cost optimization strategies. Hands-on experience using Databricks, PySpark, Python, and SQL. Proven ability to lead and develop high-performing data engineering teams. Strong problem-solving, analytical, and critical thinking skills to address complex data challenges. Preferred Qualifications: Experience integrating AI with data platforms and engineering and building AI-ready data platforms. Prior experience in data modeling, especially star-schema modeling concepts. Familiarity with ontologies, information modeling, and graph databases.
Experience working with agile development methodologies such as Scaled Agile. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps. Education and Professional Certifications: SAFe for Teams certification (preferred), Databricks certifications, AWS cloud certification. Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

Posted 1 month ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities: Design & Development: Design, develop, and maintain Qlik Sense dashboards, reports, and data models. Create interactive and intuitive visualizations that communicate actionable business insights. Develop ETL processes using Qlik scripting for data extraction, transformation, and loading. Optimize data models and dashboards for usability, scalability, and performance. Requirements Gathering & Collaboration: Work closely with stakeholders to gather and understand business and reporting requirements. Translate business needs into technical specifications and Qlik data models. Collaborate with cross-functional teams including data engineers, DBAs, and analysts for seamless data integration. Data Management & Integration: Connect Qlik Sense to various data sources: SQL Server, Oracle, Excel, CSV, cloud platforms (AWS, Azure, GCP), and APIs. Ensure data accuracy, integrity, consistency, and security across the lifecycle. Implement robust data validation and QA processes. Support & Maintenance: Provide ongoing support, issue resolution, and dashboard maintenance. Identify and resolve performance bottlenecks and data issues. Maintain technical documentation: data flow diagrams, dictionaries, user guides. Train end-users and conduct knowledge transfer sessions. Best Practices & Continuous Improvement: Follow Qlik Sense development best practices and coding standards. Stay current with new Qlik features, extensions, and BI trends. Contribute to the evolution of BI processes and tool adoption. Qualifications and Skills: Essential Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. 3+ years of hands-on Qlik Sense development experience. Strong command of Qlik scripting, data modeling (star/snowflake schema), and visualization best practices. Proficiency in SQL and relational database principles. Experience with data integration from multiple sources (databases, files, APIs). Excellent analytical thinking and the ability to translate business needs into solutions. Strong communication and interpersonal skills. Ability to work both independently and in a collaborative environment. Desirable Qualifications: Experience with Qlik NPrinting for report automation. Familiarity with QlikView and migration projects. Exposure to cloud platforms (AWS, Azure, GCP). Knowledge of Agile methodologies (Scrum, Kanban). Basic understanding of ETL processes, data warehousing, and dimensional modeling. Qlik Sense certifications are a plus.
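The data validation and QA process mentioned above can be sketched as a small rule-based check. The field names and rules here are illustrative, not tied to any particular Qlik data source; in practice such checks often run in the load script or in a pre-load pipeline.

```python
# Illustrative validation rules: required fields, non-negative amounts,
# and unique keys. Field names are invented for the sketch.
def validate_rows(rows, required=("id", "amount")):
    """Return a list of (row_index, problem) findings."""
    findings = []
    seen_ids = set()
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) in (None, ""):
                findings.append((i, f"missing {field}"))
        if isinstance(row.get("amount"), (int, float)) and row["amount"] < 0:
            findings.append((i, "negative amount"))
        if row.get("id") in seen_ids:
            findings.append((i, "duplicate id"))
        seen_ids.add(row.get("id"))
    return findings

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -5.0},   # fails the non-negative rule
    {"id": 2, "amount": 3.0},    # duplicate key
    {"id": 4, "amount": None},   # missing value
]
print(validate_rows(rows))
```

Emitting findings rather than raising on the first error lets a QA report surface every problem row in one pass.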

Posted 1 month ago

Apply

10.0 - 12.0 years

5 - 7 Lacs

Pune, Maharashtra, India

On-site

Key Responsibilities: Data Architecture: Lead the design, development, and implementation of comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms. Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance. Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems. Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models to support analytical and operational needs. Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation. Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions. Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed. Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications. Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI. Software Development Practices: Apply professional software development principles and best practices to data solution delivery. Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes. 
Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively. Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture. Position Requirements: Strong experience with data transformation & ETL on large datasets. Experience with designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale). 5+ years of experience in Data Modeling (e.g., Relational, Dimensional, Columnar, Big Data). 5+ years of experience with complex SQL or NoSQL queries. Extensive experience in advanced Data Warehouse concepts. Proven experience with industry ETL tools (e.g., Informatica, Unifi). Solid experience in Business Requirements definition, structured analysis, process design, and use case documentation. Experience with Reporting Technologies (e.g., Tableau, PowerBI). Demonstrated experience in professional software development. Exceptional organizational skills with the ability to manage multiple simultaneous customer projects. Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes. Must be self-managed, proactive, and customer-focused. Technical Skills: Cloud Platforms: Microsoft Azure Data Warehousing: Snowflake ETL Methodologies: Extensive experience in ETL processes and tools Data Transformation: Large-scale data transformation Data Modeling: Relational, Dimensional, Columnar, Big Data Query Languages: Complex SQL, NoSQL ETL Tools: Informatica, Unifi (or similar enterprise-grade tools) Reporting & BI: Tableau, PowerBI

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 7 Lacs

Pune, Maharashtra, India

On-site

Key Responsibilities: Design and Development: Design, develop, and maintain Qlik Sense dashboards, reports, and data models. Create interactive visualizations to effectively communicate data insights. Develop and implement data extraction, transformation, and loading (ETL) processes using Qlik scripting. Optimize data models and dashboards for performance, scalability, and usability. Requirements Gathering and Collaboration: Work closely with business stakeholders to gather and document business requirements. Translate business requirements into technical specifications and data models. Collaborate with cross-functional teams, including data engineers, database administrators, and business analysts, to ensure seamless data integration and delivery. Data Management and Integration: Integrate Qlik Sense with various data sources, including relational databases (SQL Server, Oracle, etc.), flat files (Excel, CSV), cloud platforms (AWS, Azure, GCP), and APIs. Ensure data accuracy, integrity, consistency, and security throughout the data lifecycle. Implement data validation and quality assurance processes. Support and Maintenance: Provide ongoing support, troubleshooting, and maintenance for existing Qlik Sense dashboards and solutions. Identify and resolve performance bottlenecks and data-related issues. Create and maintain technical documentation, including data flow diagrams, data dictionaries, and user guides. Provide training and knowledge transfer to end-users. Best Practices and Continuous Improvement: Adhere to Qlik Sense development best practices and coding standards. Stay up-to-date with the latest Qlik Sense features, functionalities, and industry trends. Contribute to the improvement of BI processes and methodologies. Qualifications and Skills: Essential Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. 
Minimum 3 years of hands-on experience with Qlik Sense development, including scripting, data modeling, and application development. Strong knowledge of Qlik Scripting, data modeling techniques (star schema, snowflake schema), and data visualization best practices. Proficiency in SQL and relational database concepts. Experience in connecting Qlik Sense to various data sources (e.g., SQL databases, Excel, CSV, APIs). Strong analytical and problem-solving skills with the ability to translate complex business requirements into technical solutions. Excellent communication and interpersonal skills to effectively collaborate with business users and technical teams. Ability to work independently and as part of a team. Desirable Qualifications: Experience with Qlik NPrinting for report distribution and automation. Knowledge of QlikView. Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data warehousing solutions. Familiarity with Agile development methodologies (e.g., Scrum, Kanban). Basic understanding of data warehousing concepts, ETL processes, and dimensional modeling. Qlik Sense certifications.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

kochi, kerala

On-site

You will be responsible for designing and implementing ETL processes to extract, transform, and load data from sources such as databases, APIs, and flat files. Your main duties will include designing and implementing ETL processes, monitoring and optimizing them for performance and efficiency, and documenting ETL processes while maintaining technical specifications. To qualify for this role, you should have 4-8 years of experience in ETL development. Proficiency in ETL tools and frameworks such as Apache NiFi, Talend, or Informatica is essential. Strong programming skills in Python are also required, along with experience in data warehousing concepts and methodologies. Preferred qualifications include certifications in relevant ETL tools or data engineering.
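The extract-transform-load flow described above can be sketched end to end in Python. This is a minimal illustration: the inline CSV stands in for a flat-file source, sqlite3 for the target warehouse, and the table and column names are invented.

```python
import csv
import io
import sqlite3

# Inline CSV standing in for a flat-file source (note the messy whitespace
# and the missing price, which the transform step must handle).
RAW = """name,price
widget, 19.99
gadget,5.00
gizmo,
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Trim whitespace, cast types, and drop rows missing a price.
    out = []
    for row in rows:
        price = row["price"].strip()
        if not price:
            continue
        out.append((row["name"].strip(), float(price)))
    return out

def load(records, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS products (name TEXT, price REAL)")
    conn.executemany("INSERT INTO products VALUES (?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*), SUM(price) FROM products").fetchone())
```

Tools like Apache NiFi, Talend, or Informatica orchestrate the same three stages at scale, adding scheduling, monitoring, and error handling around them.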

Posted 1 month ago

Apply

6.0 - 15.0 years

0 Lacs

karnataka

On-site

We are looking for experienced BODS Consultants with a solid background in Data Migration to be a part of our global consulting team. This remote position offers the chance to contribute to impactful projects spanning various industries like Financial Services, Pharmaceuticals & Life Sciences, Manufacturing, and Utilities. Your responsibilities will include delivering top-notch technology and business solutions across diverse industry domains, developing and implementing data migration deliverables independently, transforming functional specifications into technical solutions using SAP BODS, performing data profiling, cleansing, transformation, and validation activities, as well as conducting data analysis, documentation, and reporting in line with project requirements. You will collaborate with onsite/offshore teams on large-scale ERP or transformation projects, customize and configure data solutions according to client needs, mentor junior consultants, and ensure timely and accurate delivery of all assigned work products. Moreover, you will maintain adherence to organizational policies, compliance, and security standards, and engage in continuous professional development through training and client interactions. To be eligible for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field, along with 6 to 15 years of experience in SAP BODS focusing on data migration projects. Hands-on experience in full-cycle ERP implementations (SAP preferred), familiarity with data profiling, quality analysis, and ETL processes, proficiency in identifying and resolving data-related issues in complex environments, strong analytical, problem-solving, and communication skills, as well as the ability to work independently and as part of a cross-functional global team are essential requirements. If you meet the above qualifications and can join within 30 days, we invite you to apply today.
This is a unique opportunity to advance your career with a company that values innovation, integrity, and excellence in data solutions.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a BI Developer at SourceMash Technologies, you will be responsible for designing and delivering data-driven solutions using tools like Power BI, Tableau, or Qlik. Your role will involve developing dashboards, reports, and data models, as well as building ETL processes and optimizing SQL queries. Collaboration with business teams to translate requirements into actionable insights and ensuring data integrity across systems will be a key aspect of your responsibilities. With at least 3 years of hands-on experience in Business Intelligence or Data Analytics roles, you will leverage your strong expertise in BI tools such as Power BI, Tableau, QlikView, or similar platforms. Experience in building and optimizing data models (Star/Snowflake schema) for reporting purposes will be crucial, along with a good understanding of ETL processes and tools like SSIS, Informatica, Talend, or Azure Data Factory. Your excellent communication skills will be put to use as you work with cross-functional teams to design, develop, and deploy interactive dashboards and reports. Writing optimized SQL queries, stored procedures, and data models will be part of your routine tasks to ensure data accuracy, integrity, and performance across reporting systems. Developing and managing KPIs and metrics that align with business goals, as well as troubleshooting BI tools and systems, will be integral to ensuring smooth business operations. At SourceMash Technologies, you will have the opportunity to be part of a leading solution provider for internet-based applications and product development. Our company, established in 2008, is driven by highly skilled professionals dedicated to offering total IT solutions under one roof, covering Software Development, Quality Assurance, and Support services. Joining our team comes with benefits such as an employee welcome kit that includes items like a custom notepad, t-shirt, water bottle, and more. 
Additionally, we offer the best employee health insurance benefits for you and your family members under the same policy. Paid leaves are also part of our package to ensure a healthy work-life balance. If you are passionate about leveraging BI tools to drive data-driven solutions and want to work in a collaborative environment with opportunities for growth, we welcome you to apply for this exciting BI Developer position at SourceMash Technologies in Bangalore.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies