6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Power BI Specialist at our offshore location, you will join our analytics team with primary responsibility for developing, designing, and maintaining Power BI reports, dashboards, and visualizations that support data-driven decision-making. Your expertise in data modeling will be crucial to creating efficient, scalable data models at both the Power BI and data warehouse levels, and your analytical mindset and strong debugging skills will be essential for handling large datasets, identifying patterns, and turning raw data into actionable insights.

Your key responsibilities will include collaborating with stakeholders to gather requirements, designing data models that align with business objectives, and using advanced DAX functions and Power Query to optimize data transformations and calculations. You will also analyze and interpret large datasets, identify trends and patterns, and make informed decisions on Power BI connectivity modes based on data volume and performance requirements.

In addition, you will oversee the deployment and management of Power BI Service, including security, data refresh, and performance tuning. You will handle SQL-based data extraction and manipulation to meet Power BI requirements, and debug and resolve issues in Power BI reports, SQL code, and data to ensure accuracy, performance, and consistent problem resolution. Basic knowledge of ETL processes will further support data transformation and integration.

To succeed in this role, you should have at least 6 years of experience in Power BI, with a strong background in data modeling for both Power BI and data warehouses. An in-depth understanding of Power BI tools and BI processes, excellent communication skills, and a proven ability to design, develop, and optimize data models for large datasets and complex reporting needs are also required, along with strong analytical and debugging skills, proficiency in DAX and Power Query, advanced SQL, and a good understanding of ETL processes.

Nice-to-have skills include familiarity with integrating Power BI reports into other applications or platforms, knowledge of Power Apps and Power Automate for automation, experience with Azure Data Factory (ADF) for data integration, and a basic understanding of Python programming.
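As one concrete example of the Power BI Service management mentioned above, dataset refreshes can be scripted against Power BI's REST refresh endpoint. The sketch below is illustrative only — the workspace ID, dataset ID, and access token are placeholders you would supply:

```python
# Hedged sketch: queue a Power BI dataset refresh via the REST API.
# The IDs and the Azure AD access token are placeholders.
import requests

ACCESS_TOKEN = "<aad-access-token>"  # e.g., obtained via MSAL / a service principal
GROUP_ID = "<workspace-id>"          # placeholder workspace (group) ID
DATASET_ID = "<dataset-id>"          # placeholder dataset ID

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()  # HTTP 202 Accepted means the refresh was queued
```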
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
You will be responsible for developing and maintaining BI reports and dashboards, writing SQL queries for data extraction, manipulation, and analysis, troubleshooting BI tools, and resolving data-related issues. Additionally, you will support data cleansing and validation efforts. As a full-time employee, you are required to hold a Bachelor of Engineering in IT; a Master's in Computer Science or a related field would be an added advantage.

To excel in this role, you should possess advanced proficiency in SQL, data warehousing, and ETL processes. Extensive experience with BI tools like Power BI, Tableau, QlikView, or similar is essential, as is a strong understanding of database design and data modeling best practices, expertise in Excel, and advanced data visualization techniques. Your ability to analyze complex datasets, provide actionable insights, optimize data pipelines for performance and scalability, and work with databases like MS SQL, MySQL, or Oracle is crucial.

Furthermore, you will be involved in designing and developing complex data models and BI solutions, advanced SQL querying, building interactive dashboards and visualizations, ETL pipeline design and automation, leading data governance efforts, and performance tuning of BI tools.

This position is based in Mumbai, and the salary offered will be as per industry standards. Details of compensation and other benefits will be communicated during the hiring process. Envecon Group is an equal opportunity employer that values diversity in the workplace and encourages candidates from all backgrounds to apply. To apply for this position, please send your CV to tyrell.hewett@envecon.com.
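Since the role centers on SQL extraction against databases like MS SQL, a typical pull might look like the sketch below; the server, database, and table names are invented for illustration:

```python
# Illustrative only: extract aggregated report data from MS SQL into pandas.
import pandas as pd
import pyodbc

# Placeholder connection string — adjust driver/server/auth to your environment.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reports-db;DATABASE=sales;Trusted_Connection=yes;"
)
query = """
    SELECT region, SUM(net_amount) AS revenue
    FROM dbo.orders
    WHERE order_date >= DATEADD(month, -3, GETDATE())
    GROUP BY region
    ORDER BY revenue DESC;
"""
df = pd.read_sql(query, conn)  # result feeds a dashboard or validation step
conn.close()
print(df.head())
```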
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
kochi, kerala
On-site
As an Informatica Developer at Viraaj HR Solutions, you will be responsible for developing and implementing ETL processes using Informatica PowerCenter. Collaborating with data architects and business analysts, you will gather requirements for data integration projects. Your role will involve data mapping and transformation to ensure data accuracy and consistency, as well as maintaining and optimizing existing ETL workflows for improved performance. Additionally, you will conduct data profiling and analysis to identify data quality issues, implement best practices for data governance and compliance, and develop technical documentation for ETL processes. Monitoring and troubleshooting ETL processes to promptly resolve issues, performance tuning of Informatica mappings and sessions, and working within a team to deliver high-quality data solutions on time are also key responsibilities.

To excel in this role, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related field, along with a minimum of 3 years of experience as an Informatica Developer. Strong knowledge of Informatica PowerCenter, proficiency in SQL, experience with data warehousing concepts, and familiarity with ETL processes and data cleansing practices are essential qualifications, as are a solid understanding of database systems such as Oracle or SQL Server, excellent problem-solving skills, and the ability to work effectively in a team-oriented environment. Strong written and verbal communication skills, experience with version control tools (e.g., Git), knowledge of data profiling tools and techniques, and the ability to handle multiple tasks and projects simultaneously are also desired. Familiarity with Agile methodologies, certifications in Informatica or related technologies, and a willingness to learn new tools and technologies will be advantageous.

Key skills: Informatica PowerCenter, SQL, performance tuning, SQL Server, technical documentation, Oracle, Agile methodologies, database systems, data profiling, data mapping, problem-solving, team collaboration, version control, data transformation, ETL processes, data warehousing. This position offers an exciting opportunity to contribute to impactful data integration projects and grow your skills in a dynamic environment.
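Data profiling of the kind described is often prototyped outside Informatica first. A minimal pandas pass, assuming a hypothetical extract file and columns:

```python
# Rough data-profiling pass: per-column types, null rates, and cardinality.
# The source file and its columns are hypothetical.
import pandas as pd

df = pd.read_csv("customers_extract.csv")  # placeholder extract

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "nulls": df.isna().sum(),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})
print(profile)
print("duplicate rows:", df.duplicated().sum())  # quick data-quality signal
```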
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
thanjavur, tamil nadu
On-site
You will be responsible for managing and optimizing PLM and MES systems to enhance manufacturing processes. This includes handling large volumes of structured and unstructured data, identifying and overcoming technical difficulties and integration issues seen in previous projects, and implementing robust data security measures to protect sensitive information. Your role will also involve designing and maintaining data warehouse architecture and optimizing Power BI performance, with a focus on storage mode and response time.

In addition, you will need a good understanding of manufacturing workflows, production planning, and quality control. You will be expected to generate comprehensive reports based on project requirements and communicate effectively with business users to understand and meet their needs. Managing activities related to Google Cloud Platform (GCP) and Extract, Transform, Load (ETL) processes will also be part of your responsibilities.

The ideal candidate should have familiarity with Google Cloud Platform (GCP) and e-commerce analytics, along with a minimum of 3 years of experience in data engineering, data analysis, or a similar role, preferably in e-commerce. Proven experience in managing data within manufacturing processes, strong knowledge of PLM and MES systems, and proficiency in data security implementation and data warehouse architecture are mandatory requirements. Excellent communication skills and the ability to work with business users are essential. You should possess strong knowledge of GCP and ETL processes, as well as proficiency in SQL and Python (or R) for data extraction, transformation, and analysis. Experience with Snowflake, dbt data modelling, Dagster, and ETL/ELT processes is preferred, and advanced proficiency in Power BI for creating interactive dashboards and reports is also necessary. Familiarity with GA4, Google Ads, Meta Ads, Meta S2S, and Braze Marketing APIs, as well as experience with cloud platforms and data integration tools such as Google BigQuery or AWS, will be advantageous. Experience with e-commerce analytics, such as funnel analysis, cohort analysis, and attribution modelling, is a plus.

Qualifications required for this role include a Bachelor's degree in a relevant field (e.g., Computer Science, Engineering, Data Management). Please note that candidates from the Tamil Nadu location are preferred. Only shortlisted profiles will be contacted for further consideration.
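On the GCP side, much of the described ETL reduces to parameterized queries run through the official BigQuery client. A minimal sketch, assuming an invented dataset and table:

```python
# Sketch of a GCP-side extraction step; dataset and table names are invented.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials
sql = """
    SELECT channel, COUNT(DISTINCT user_id) AS buyers
    FROM `shop_analytics.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY channel
"""
df = client.query(sql).to_dataframe()  # needs the pandas/db-dtypes extras
print(df.head())
```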
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer (SQL & Snowflake) at Cygnus Professionals, Inc., you will be responsible for designing, developing, and maintaining data pipelines and ETL processes using SQL and Snowflake. In this Contract-to-Hire (CTH) role, you will leverage your 4-5 years of data engineering experience to build and optimize data models for efficient storage and retrieval in Snowflake. Your expertise in data modeling, ETL processes, and cloud-based data platforms will play a crucial role in ensuring data quality, integrity, and security across all pipelines.

Collaborating closely with business stakeholders, you will understand data requirements and implement scalable solutions that enable data-driven decision-making. Your responsibilities will include optimizing query performance in Snowflake to handle large datasets efficiently, troubleshooting data-related issues, and ensuring smooth data flow across systems. Additionally, you will work with data analysts, data scientists, and software engineers to implement best practices in data governance, security, and compliance, particularly within the pharmaceutical industry.

To excel in this role, you should possess strong skills in writing complex SQL queries, performance tuning, and query optimization. Experience with ETL/ELT tools, data pipeline orchestration, data warehousing concepts, and cloud-based architectures is essential. Familiarity with Python, DBT, or other scripting languages for data transformation is a plus, and prior experience in the pharmaceutical industry or with healthcare-related data is preferred. Your problem-solving skills, ability to work in a fast-paced environment, and excellent communication and collaboration skills will be key to your success in this role.

Cygnus Professionals, Inc. is a global business IT consulting and software services firm headquartered in Princeton, NJ, with offices in the USA and Asia. For over 15 years, Cygnus has been dedicated to enabling innovation, accelerating time to market, and fostering growth for clients while maintaining deep relationships. For more information about Cygnus, visit www.cygnuspro.com.

If you are a dynamic and experienced Data Engineer with a passion for SQL and Snowflake, we invite you to join our team in Chennai or Mumbai and contribute to our mission of driving success through data-driven solutions in the pharma industry.
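A small taste of the Snowflake side of this role: bulk-loading staged files with the Python connector. A minimal sketch, assuming placeholder account details, stage, and table names:

```python
# Minimal sketch using the Snowflake Python connector; every connection
# parameter, the stage, and the target table are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="acme-xy12345", user="etl_user", password="...",
    warehouse="ANALYTICS_WH", database="PHARMA", schema="STAGING",
)
cur = conn.cursor()
try:
    # COPY INTO is Snowflake's bulk-load path for pipeline ingestion.
    cur.execute("""
        COPY INTO staging.trial_events
        FROM @raw_stage/trial_events/
        FILE_FORMAT = (TYPE = PARQUET)
    """)
finally:
    cur.close()
    conn.close()
```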
Posted 1 week ago
3.0 - 15.0 years
0 Lacs
coimbatore, tamil nadu
On-site
You will be joining Nallas, a software services and product company that specializes in providing data, cloud, product, and quality engineering services for enterprises looking to undergo transformation with predictability. As a valued partner to our customers, we collaborate on high-priority projects where failure is not an option, and we are selective in working with customers who seek a partner equally committed to achieving successful business outcomes. What sets us apart is our unwavering focus on and investment in our customers' business objectives, our commitment to delivering with stable, static, and accountable teams, and our dedication to offering unmatched transparency and predictability in our services.

At Nallas, we work with cutting-edge technologies such as Apache, Kubernetes, Snowflake, Delta Lake, containerization, multi-cloud architecture, AI, machine learning, AR/VR, blockchain, IoT, test automation, and DevOps. Through our services, we aim to elevate data quality, scalability, efficiency, and security for our clients while helping them reduce costs and improve overall quality. We operate within Agile development frameworks like Scrum and Kanban, allowing us to deliver rapid results and adjust to an ever-evolving market.

We are currently seeking a candidate with 7-15 years of experience to join our team in Coimbatore. The ideal candidate should possess the following qualifications:

- At least 3 years of experience in thorough testing and validation of ETL processes and workflows.
- Proficiency in automation tools such as Selenium WebDriver with Java or C#.
- Hands-on experience in testing web services/APIs using automation tools like JMeter, REST Assured, etc.
- Familiarity with testing in a cloud environment, particularly AWS, would be beneficial.

If you are enthusiastic about working with the latest technologies and eager to advance your career, Nallas offers an exceptional work environment that nurtures both personal and professional growth, with a collaborative, innovative, and team-oriented atmosphere where your contributions are valued.

To apply for this exciting opportunity, please send your resume to karthicc@nallas.com. Join us at Nallas and be part of our dynamic team driving impactful transformations through technology.
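The posting names Java/C# tooling (Selenium, REST Assured); the same API-testing idea, transposed to Python with pytest and requests against an invented endpoint, looks like this:

```python
# Hedged equivalent of a REST Assured-style API test, in Python.
# The base URL, route, and payload shape are invented for illustration.
import requests

BASE_URL = "https://api.example.test"  # placeholder test environment

def test_create_order_returns_201_and_echoes_quantity():
    payload = {"sku": "ABC-1", "qty": 2}
    resp = requests.post(f"{BASE_URL}/orders", json=payload, timeout=10)
    assert resp.status_code == 201          # resource created
    body = resp.json()
    assert "order_id" in body               # server assigned an ID
    assert body["qty"] == 2                 # request round-tripped intact
```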
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
You will partner with business users to understand their analytical needs and translate them into technical requirements for SAP Analytics Cloud (SAC). Your role will involve designing and developing robust data models, leveraging your expertise in SAP HANA and data modeling best practices. You will be responsible for configuring and customizing SAC stories, dashboards, and applications to deliver impactful visualizations and user experiences. Building automated data flows and ETL processes to ensure clean and consistent data for analysis will be a key part of your responsibilities. Additionally, you will integrate SAC with various data sources, including SAP and non-SAP systems, using your knowledge of APIs and data connectors, and perform system administration tasks within SAC to ensure optimal performance and security. Supporting end users through knowledge-transfer sessions and troubleshooting technical issues will be essential, as will staying up to date on the latest SAC features and functionalities and proactively proposing innovative solutions.

Your qualifications for this role include a Bachelor's degree in Computer Science, Information Technology, Business Intelligence, or a related field (or equivalent experience), a minimum of 3 years of experience working with SAP Analytics Cloud (SAC), a strong understanding of data modeling concepts and techniques, proficiency in SQL, and experience with data manipulation tools.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
As a Data Visualization Specialist at Boston Scientific, you will have the opportunity to use your skills in transforming complex data into clear, interactive, and visually compelling insights. Collaborating with data analysts, business intelligence teams, and stakeholders, you will play a crucial role in presenting data-driven insights effectively to guide critical business decisions.

Based in Gurugram, India, your key responsibilities will include developing intuitive visualizations using tools like Tableau and Power BI, collaborating with cross-functional teams to understand data sources, designing real-time interactive dashboards, and ensuring data accuracy and consistency through data transformation. Additionally, you will create visual narratives tailored to different audiences, learn and leverage programming languages for data manipulation, and optimize visualizations for performance and scalability.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Data Science, or a related field, along with a minimum of 7 years of experience in data visualization or business intelligence. Certification in Tableau or Power BI is required. Required technical skills include proficiency in designing interactive visualizations and dashboards, advanced SQL skills, familiarity with libraries like Plotly and ggplot2, knowledge of data visualization best practices, and experience with large datasets and ETL processes. Familiarity with UX/UI principles, version control tools, and cloud platforms for visualization and BI services will be an advantage. Effective communication and storytelling abilities are essential for translating technical data insights for non-technical audiences, and prior experience in the Supply Chain Management, Sales, or Finance domains is preferred.

If you are a natural problem-solver with the determination to make a meaningful impact on a global scale, we invite you to apply for this position and join Boston Scientific in advancing science for life and transforming lives through innovative medical solutions.
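Plotly appears among the required libraries; a toy example of the kind of charting involved, with an invented data frame:

```python
# Small Plotly illustration; the metric and values are made up.
import pandas as pd
import plotly.express as px

df = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"],
    "on_time_shipments": [0.92, 0.95, 0.91, 0.97],
})
fig = px.line(df, x="month", y="on_time_shipments",
              title="On-time shipment rate", markers=True)
fig.show()  # renders an interactive chart in a browser or notebook
```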
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a SAP BI/BW Consultant, you will collaborate with business stakeholders to understand analytics requirements and objectives. You will design and develop SAP BI/BW data models and architecture, create and manage ETL processes for data extraction, transformation, and loading, and develop and maintain SAP BI/BW reports, dashboards, and visualizations. Your role will involve optimizing system performance, recommending improvements for data quality and accuracy, and providing expertise in SAP Analytics Cloud and other SAP BI tools. Additionally, you will support and train users in utilizing SAP BI/BW solutions effectively, participate in project planning, estimation, and risk management activities, and conduct data analysis to provide actionable insights for business growth. Staying current with SAP BI/BW best practices and industry trends is essential.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field. You must have proven experience implementing SAP BI/BW solutions in a consulting environment, expertise in SAP data modeling, ETL processes, and report development, and strong knowledge of SAP HANA, SAP Analytics Cloud, and SAP BusinessObjects. Proficiency in ABAP programming for customizing SAP BI/BW solutions, excellent analytical and problem-solving skills, and the ability to communicate effectively with stakeholders and team members are also required. Project management skills with the ability to handle multiple priorities, relevant SAP certifications, and an understanding of data governance and security best practices will be advantageous.

Key Skills: analytics, SAP BusinessObjects, data security, project management, SAP BI, SAP NetWeaver Business Warehouse (SAP BW), data governance, SAP, ETL processes, SAP Analytics Cloud, SAP BI/BW data models, ABAP programming, data modeling
Posted 1 week ago
5.0 - 10.0 years
10 - 16 Lacs
mumbai, mumbai suburban, mumbai (all areas)
Work from Office
Location: Mumbai (Andheri East)
Designation: Data Engineering/Analytics

Job Summary: We are looking for a Data Engineer with up to 5 years of experience in ETL processes to lead the migration of MongoDB data to MSSQL. The ideal candidate will design, develop, and optimize data pipelines, ensuring seamless data transformation, integrity, and performance.

Key Responsibilities:
- Design and implement ETL workflows for migrating data from MongoDB to MSSQL.
- Develop data transformation and cleansing processes to ensure data integrity.
- Optimize data pipelines for performance, scalability, and reliability.
- Collaborate with database administrators and business teams to define migration strategies.
- Work with SQL Server tools (SSIS, T-SQL) for data integration.
- Monitor and troubleshoot ETL workflows to ensure smooth operations.
- Maintain documentation for migration processes, data models, and pipeline configurations.

Required Skills & Qualifications:
- Proficiency in MongoDB and MSSQL (schema design, indexing, query optimization).
- Strong knowledge of ETL tools (SSIS, Talend, Informatica).
- Experience in data migration strategies and performance tuning.
- Expertise in SQL Server (stored procedures, triggers, views).
- Understanding of data governance, security, and compliance.

Preferred Qualifications:
- Experience with cloud-based data migration (AWS RDS).
- Knowledge of the MongoDB aggregation framework.
- Exposure to CI/CD pipelines for database deployments.
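One plausible shape for the MongoDB-to-MSSQL step, sketched with pymongo and pyodbc; connection strings, the source collection, and the target table are placeholders:

```python
# Hedged migration sketch: read documents from MongoDB, bulk-insert into MSSQL.
# All names here are placeholders, not a specific production schema.
import pyodbc
from pymongo import MongoClient

mongo = MongoClient("mongodb://localhost:27017")
orders = mongo["shop"]["orders"]  # placeholder database/collection

mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=target-db;DATABASE=shop_dw;Trusted_Connection=yes;"
)
cur = mssql.cursor()
cur.fast_executemany = True  # speeds up large batched inserts

# Flatten each document into a tuple matching the relational target.
batch = [
    (str(doc["_id"]), doc.get("customer_id"), float(doc.get("total", 0)))
    for doc in orders.find({}, {"customer_id": 1, "total": 1})
]
cur.executemany(
    "INSERT INTO dbo.orders (source_id, customer_id, total) VALUES (?, ?, ?)",
    batch,
)
mssql.commit()
```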
Posted 2 weeks ago
5.0 - 10.0 years
4 - 7 Lacs
bengaluru, karnataka, india
On-site
- Extensive experience of 5+ years in Power BI, with 10+ years overall in BI.
- Extensive experience with BI tools like Power BI, SSRS, Tableau, etc.
- Hands-on experience with RDBMS platforms, i.e., SQL Server, Oracle, etc.
- Experience designing solutions for aggregated facts using metadata.
- In-depth knowledge of database and BI architecture design.
- Worked on report/dashboard migration from traditional to modern BI tools like Power BI and Tableau.
- Managed high volumes of data in single loads.
- Worked on optimizing data modelling, database, and BI solutions.
- Applied architectural and engineering concepts to design solutions that meet operational requirements such as scalability, maintainability, security, reliability, extensibility, flexibility, availability, and manageability.
- Developed technology specifications, ensuring that new technology solutions are designed for optimal access and usefulness while leveraging existing technologies where possible.
- Good understanding of data visualization tools.
- Ability to develop cloud data and BI solutions per enterprise requirements.
- Participate in the development of business intelligence solutions.
- Ability to create and maintain conceptual business, logical, and physical data models.
- Experience translating/mapping relational data models into database schemas.
- Develop and create transformation queries, views, and stored procedures for ETL processes and process automation.
Posted 2 weeks ago
3.0 - 5.0 years
3 - 7 Lacs
remote, india
On-site
Responsibilities:
- Design, develop, and maintain data infrastructure, databases, and data pipelines.
- Develop and implement ETL processes to extract, transform, and load data from various sources.
- Ensure data accuracy, quality, and accessibility, and resolve data-related issues.
- Collaborate with data analysts, data scientists, and other stakeholders to understand data needs and requirements.
- Develop and maintain data models and data dictionaries.
- Design and implement data warehousing solutions to enable efficient and effective data analysis and reporting.
- Implement and manage data security and access controls to protect data privacy and confidentiality.
- Strong understanding of data architecture, data modeling, ETL processes, and data warehousing.
- Excellent communication and collaboration skills.
Posted 2 weeks ago
6.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
The Precision Medicine technology team at Amgen is dedicated to developing Data Searching, Cohort Building, and Knowledge Management tools to provide visibility into Amgen's extensive human datasets, projects, study histories, and scientific findings. The team focuses on managing multiomics data, clinical study subject measurements, images, and specimen inventory data. The PMED capabilities play a crucial role in Amgen's mission to accelerate discovery and speed to market of advanced precision medications.

As a Solution and Data Architect, you will be responsible for designing an enterprise analytics and data mastering solution using Databricks and Power BI. This role demands expertise in data architecture and analytics to create scalable, reliable, and high-performing solutions for research cohort-building and advanced research pipelines. The ideal candidate will have experience creating unified repositories of human data from multiple sources. Collaboration with stakeholders from data engineering, business intelligence, and IT teams is essential to design and implement data models, integrate data sources, and ensure data governance and security best practices. The role requires a strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering.

Key Responsibilities:
- Architect scalable enterprise analytics solutions using Databricks, Power BI, and modern data tools.
- Utilize data virtualization, ETL, and semantic layers to balance unification, performance, and data transformation.
- Support development planning by aligning features with architectural direction.
- Participate in pilots and proofs-of-concept for new patterns.
- Document architectural direction, patterns, and standards.
- Train engineers and collaborators on architecture strategy and patterns.
- Collaborate with data engineers to build and optimize ETL pipelines (see the CDC sketch after this listing).
- Design robust data models and processing layers for analytical processing and reporting.
- Implement data governance, security, and compliance best practices.
- Integrate data systems with enterprise applications for seamless data flow.
- Provide thought leadership on data architecture and advanced analytics.
- Develop and maintain optimized Power BI solutions.
- Serve as a subject matter expert on Power BI and Databricks.
- Define data requirements, architecture specifications, and project goals.
- Evaluate and adopt new technologies to enhance data solutions.

Basic Qualifications and Experience:
- Master's degree with 6-8 years of experience OR Bachelor's degree with 8-10 years of experience OR Diploma with 10-12 years of experience in data management and solution architecture.

Functional Skills:

Must-Have Skills:
- Hands-on experience with BI solutions and CDC ETL pipelines.
- Expertise in Power BI, Databricks, and cloud platforms.
- Strong communication and collaboration skills.
- Ability to assess business needs and align solutions.

Good-to-Have Skills:
- Experience with human healthcare data and clinical trial data management.

Professional Certifications:
- ITIL Foundation or relevant certifications (preferred).
- SAFe Agile Practitioner.
- Microsoft Certified: Data Analyst Associate or related certification.
- Databricks Certified Professional or similar certification.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Intellectual curiosity and self-motivation.
- Strong communication skills.
- Confidence as a technical leader.
- Ability to work effectively in global, virtual teams.
- Strong problem-solving and analytical skills.
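The must-have "CDC ETL pipelines" item usually lands on Delta Lake's merge upsert when the platform is Databricks. A hedged sketch, assuming invented table and key names rather than Amgen's actual schema:

```python
# CDC-style upsert on Databricks/Delta Lake; table, path, and key names
# are assumptions for illustration only.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incremental change set landed by an upstream extract (placeholder path).
updates = spark.read.format("parquet").load("/mnt/landing/subjects/")

target = DeltaTable.forName(spark, "pmed.subjects")  # placeholder table
(target.alias("t")
    .merge(updates.alias("s"), "t.subject_id = s.subject_id")
    .whenMatchedUpdateAll()     # apply changed records
    .whenNotMatchedInsertAll()  # add new records
    .execute())
```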
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Epergne Solutions is seeking a Data Platform Engineer with over 6 years of total experience, including at least 5 years of relevant experience in Python and Airflow data engineering. The role is based in India and is a contract position with a work-from-office mode.

As a Data Platform Engineer at Epergne Solutions, you will design, develop, and maintain complex data pipelines using Python for efficient data processing and orchestration. You will collaborate with cross-functional teams to understand data requirements and architect robust solutions within the AWS environment. Your role will also involve implementing data integration and transformation processes, optimizing existing data pipelines, and troubleshooting pipeline issues to ensure smooth operation and minimal downtime.

The ideal candidate should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with proficiency in Python and SQL for data processing and manipulation. You should have a minimum of 5 years of experience in data engineering, with a strong background in Apache Airflow and AWS technologies, particularly S3, Glue, EMR, Redshift, and AWS Lambda. Knowledge of Snowflake is preferred, along with experience optimizing and scaling data pipelines for performance and efficiency.

In addition to technical skills, you should possess excellent problem-solving abilities, effective communication skills, and the capacity to work in a fast-paced, collaborative environment. Keeping abreast of the latest industry trends and best practices in data engineering and AWS services is crucial for this role.

Preferred qualifications include AWS certifications related to data engineering or big data, experience with big data technologies like Snowflake, Spark, Hadoop, or related frameworks, familiarity with other data orchestration tools besides Apache Airflow, and knowledge of version control systems like Bitbucket and Git. Epergne Solutions prefers candidates who can join within 30 days or have a shorter notice period.
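A skeletal Airflow DAG of the kind this role maintains might look like the following; the task bodies, bucket, and schedule are illustrative placeholders, not Epergne's actual pipeline:

```python
# Minimal two-task Airflow DAG: extract to S3, then load to Redshift.
# Task logic is stubbed; names and schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_s3():
    print("pull source data and land it in s3://raw-bucket/ ...")

def load_to_redshift():
    print("COPY landed files into Redshift staging ...")

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load", python_callable=load_to_redshift)
    extract >> load  # load runs only after extract succeeds
```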
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Tableau Developer at Mastech Digital in Chennai, Tamil Nadu, India, you will play a crucial role in designing, developing, and deploying robust Tableau solutions for global enterprises. Your expertise in data visualization and business intelligence will be essential in creating interactive dashboards and reports that provide impactful insights for data-driven decision-making.

Your responsibilities will include developing visually compelling dashboards, implementing advanced calculations, and optimizing Tableau workbooks for performance. You will integrate data from diverse sources, perform data cleansing and transformation, and mentor junior team members on Tableau development best practices.

Your technical skills should include extensive experience with Tableau Desktop and Server, proficiency in SQL, and familiarity with scripting languages like Python. Experience with data warehousing, ETL processes, and other reporting tools will be beneficial. With 6 to 8 years of experience in BI and data analysis, including at least 5 years in Tableau development, you should hold a Master's degree in Computer Science or a related field. Strong analytical, problem-solving, and communication skills are essential for this role.

Joining Mastech Digital offers you the opportunity to work on challenging BI projects, a collaborative work environment, career growth opportunities, a competitive salary and benefits package, and a hybrid work schedule. If you are passionate about Tableau development and eager to contribute to impactful BI initiatives, we welcome you to join our dynamic team at Mastech Digital.
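Tableau Server administration of this sort is commonly scripted with the tableauserverclient library. A hedged sketch — the server URL, token name, and secret are placeholders:

```python
# Possible Tableau Server automation via tableauserverclient; all
# credentials and the server URL are placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    workbooks, _ = server.workbooks.get()  # first page of workbooks
    for wb in workbooks:
        print(wb.name, wb.project_name)    # quick inventory of content
```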
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chittoor, andhra pradesh
On-site
The role of Master Data Engineer at Weir Minerals involves data engineering tasks including data modeling, ETL processes, data warehousing, and data analytics to support the processing solutions offered to the mining industry. As a full-time on-site position based in Chittoor, you will play a crucial role in optimizing processes, reducing downtime, and increasing plant capacity for efficient processing and asset protection.

To excel in this role, you should possess expertise in data engineering and data modeling, along with hands-on experience in Extract, Transform, Load (ETL) processes and data warehousing. Proficiency in data analytics is also essential to effectively support the mining-industry solutions provided by Weir Minerals.

The ideal candidate will have strong problem-solving and analytical skills, complemented by excellent communication and teamwork abilities. A Bachelor's or Master's degree in Computer Science, Data Science, or a related field is required. If you are passionate about leveraging your technical expertise to drive innovation and efficiency in the mining industry, we encourage you to apply for the Master Data Engineer position at Weir Minerals.
Posted 2 weeks ago
4.0 - 12.0 years
0 Lacs
karnataka
On-site
As a DAX Modeler specializing in Power BI, you will be an individual contributor based in Bangalore with a hybrid work arrangement. You should have a minimum of 7 years of overall experience, of which at least 4 years should be in DAX modeling. Your primary responsibilities will include DAX modeling in Power BI, data engineering tasks, SQL work, and ETL processes.

Key Responsibilities:
- Apply your strong expertise in DAX modeling (a minimum of 6 years of relevant experience).
- Hands-on work in Power BI for reporting and modeling.
- Data engineering tasks, demonstrating proficiency in SQL and ETL processes.
- Data warehousing work, particularly with terabyte-scale data.
- Use Azure and related data management tools for efficient data handling.

Skills Required:
- Expertise in DAX modeling
- Proficiency in Power BI
- Data engineering exposure
- Strong SQL skills
- Experience with data warehousing
- Familiarity with Azure and related data management tools

This is a full-time employment opportunity in a dynamic work environment. The ideal candidate is an expert in DAX modeling, has proven skills in Azure, and possesses advanced knowledge of ETL processes. If you meet these criteria and are comfortable working in the Belgium time zone with a notice period of up to 30 days, we look forward to receiving your application.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
The world's top banks use Zafin's integrated platform to drive transformative customer value. Powered by an innovative AI-powered architecture, Zafin's platform seamlessly unifies data from across the enterprise to accelerate product and pricing innovation, automate deal management and billing, and create personalized customer offerings that drive expansion and loyalty. Zafin empowers banks to drive sustainable growth, strengthen their market position, and define the future of banking centered around customer value.

We are seeking a talented Senior Go Engineer to join our dynamic team. In this role, you will be responsible for designing, developing, and maintaining scalable applications using Go. While primary expertise in Go is required, experience in data engineering is a plus, as you may have the opportunity to work on data-related projects as well.

Key Responsibilities:
- Design, develop, and maintain high-performance, scalable applications using Go.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Write clean, maintainable, and efficient code, following best practices and coding standards.
- Troubleshoot and debug applications to ensure optimal performance and reliability.
- Participate in code reviews and mentor junior developers to foster a collaborative learning environment.
- Work on data engineering projects, including data ingestion, transformation, and storage solutions.
- Stay current with industry trends and emerging technologies to continuously improve our tech stack.

Qualifications:

Required:
- 5+ years of professional experience in software development, with a strong focus on Go programming.
- Proficiency in building RESTful APIs and microservices.
- Strong understanding of software development principles, design patterns, and best practices.
- Experience with version control systems (e.g., Git) and agile methodologies.

Preferred:
- Familiarity with data engineering concepts and tools (e.g., ETL processes, data pipelines, data warehousing).
- Experience with databases (SQL and NoSQL) and data storage solutions.
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes).
- Knowledge of streaming platforms (Kafka, RabbitMQ, JMS).

Joining our team means being part of a culture that values diversity, teamwork, and high-quality work. We offer competitive salaries, annual bonus potential, generous paid time off, paid volunteering days, wellness benefits, and robust opportunities for professional growth and career advancement. Want to learn more about what you can look forward to during your career with us? Visit our careers site and our openings: zafin.com/careers

Zafin welcomes and encourages applications from people with disabilities. Accommodations are available on request for candidates taking part in all aspects of the selection process. Zafin is committed to protecting the privacy and security of the personal information collected from all applicants throughout the recruitment process. The ways in which Zafin collects, uses, stores, handles, retains, or discloses applicant information can be reviewed in Zafin's privacy policy at https://zafin.com/privacy-notice/. By submitting a job application, you confirm that you agree to the processing of your personal data by Zafin as described in the candidate privacy notice.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
thane, maharashtra
On-site
About Welspun
Welspun World is one of India's fastest growing global conglomerates with businesses in Home Textiles, Flooring Solutions, Advanced Textiles, DI Pipes, Pig Iron, TMT Bars, Stainless Steel, Alloy, Line Pipes, Infrastructure & Warehousing. At Welspun, we strongly believe in our purpose to delight customers through innovation and technology, and to achieve inclusive & sustainable growth to remain eminent in all our businesses. From homes to highways, hi-tech to heavy metals, we lead tomorrow together to create a smarter & more sustainable world.

Job Purpose/Summary
As a Solution Architect at Welspun, you will be responsible for analyzing business requirements and translating them into analytics solutions. You will architect, design, and implement scalable data analytics solutions using Databricks and other Azure cloud services. Additionally, you will lead data engineering initiatives, develop and optimize data models, and ensure efficient deployment of machine learning models. Collaboration with cross-functional teams, mentoring junior team members, and driving data-driven decision-making are also key responsibilities of this role.

Responsibilities
- Analyze business requirements and translate them into analytics solutions.
- Architect, design, and implement scalable data analytics solutions using Databricks and other Azure cloud services.
- Lead data engineering initiatives, including data pipelines, data lakes, and data warehousing solutions.
- Develop and optimize data models for analytical and operational use cases.
- Implement and drive MLOps best practices for efficient deployment, monitoring, and management of machine learning models.
- Collaborate with business stakeholders to understand data requirements and translate them into effective analytics solutions.
- Enable data visualization and business intelligence using tools such as Power BI.
- Ensure data security, compliance, and governance within Azure and associated technologies.
- Provide technical leadership and mentorship to the data engineering and analytics teams.
- Stay up to date with the latest advancements in data engineering, cloud analytics, and AI/ML technologies.
- Drive data modeling efforts and ensure optimal database structures for analytics and reporting.
- Collaborate with Data Science teams to integrate AI/ML/GenAI solutions into the data ecosystem.
- Ensure data quality, integrity, and reliability throughout the data lifecycle.
- Engage with cross-functional teams to understand business needs and communicate complex data insights to non-technical stakeholders.
- Design cost-efficient data architectures and optimize cloud costs.
- Identify opportunities for process improvements and implement best practices in data analytics.
- Stay abreast of industry trends and advancements in data analytics.
- Promote a culture of continuous learning and development within the team.

Requirements
- Bachelor's degree in Business Administration, Information Technology, Data Science, or a related field; a Master's degree is a plus.
- 10-14 years of experience in data engineering, analytics, visualization, and AI/ML.
- Hands-on expertise in Databricks and the Microsoft Azure ecosystem.
- Strong knowledge of MLOps frameworks and best practices.
- Proficiency in Python, SQL, and Spark for data processing and analysis.
- Deep understanding of data pipelines, ETL processes, and cloud-based data lake solutions.
- Experience in developing and deploying AI/ML models.
- Expertise in data governance, security, and compliance within cloud environments.
- Experience with Power BI and other visualization tools.
- Excellent communication and stakeholder management skills.
- Domain experience in manufacturing is preferred.
- Strong analytical and problem-solving skills with attention to detail.

Preferred Skills
- Experience in the manufacturing industry.
- Familiarity with machine learning and advanced analytics techniques.
- Familiarity with Python for data analysis & automation is an advantage.

Job Title: SBAPL_Solution Architect
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
You will be joining Genpact as a Lead Consultant, Salesforce Developer - Data Cloud, where your primary responsibility will be to design, develop, and implement solutions using Data Cloud and Salesforce OMS. The role requires a strong understanding of Agentforce and hands-on development experience with Data Cloud.

Your key responsibilities will include:

Data Cloud Development:
- Designing and implementing data pipelines for data ingestion, transformation, and loading into Data Cloud.
- Developing data models and flows to enable advanced analytics and insights.
- Creating data visualizations and dashboards to communicate data-driven insights.
- Integrating machine learning and AI models into Data Cloud for enhanced data analysis.

Agentforce Development (Good to Have):
- Designing, developing, and deploying Agentforce agents to automate tasks and improve customer service efficiency.
- Writing complex Prompt Builder steps and flows.
- Implementing complex Agentforce orchestration flows for automating processes.
- Integrating Agentforce with Data Cloud, OMS, Service Cloud, Experience Cloud, and other relevant systems.
- Training and supporting users on Agentforce best practices.
- Optimizing Agentforce agents for performance and scalability.
- Monitoring Agentforce agents for errors and performance issues and implementing corrective actions.

Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience in cloud data engineering or a similar role.
- Strong knowledge of cloud platforms like AWS, Azure, or Google Cloud.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data modeling, ETL processes, and data warehousing.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Location: India-Gurugram
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting Date: Mar 13, 2025, 10:42:34 AM
Master Skills List: Digital
Job Category: Full Time

Join us at Genpact and be part of a global professional services and solutions firm dedicated to shaping the future and creating lasting value for clients.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
We are seeking a talented and experienced Senior Data Modeler to design, implement, and maintain data models that enhance data quality, performance, and scalability. Your role will involve collaborating with cross-functional teams, including data analysts, architects, and business stakeholders, to ensure that data models align with business requirements and drive efficient data management.

Your key responsibilities will include designing, implementing, and maintaining data models that support business requirements; collaborating with various teams to align data models with business needs; leveraging expertise in Azure, Databricks, and data warehousing; managing and optimizing relational and NoSQL databases; contributing to ETL processes and data integration pipelines; applying data modeling principles and techniques; staying updated with industry trends and emerging technologies; and driving the adoption of data modeling best practices within the organization.

To excel in this role, you should have a minimum of 6 years of experience in data modeling, expertise in Azure and Databricks, proficiency in data modeling tools like ER/Studio and Hackolade, a strong understanding of data modeling principles and techniques, experience with relational and NoSQL databases, knowledge of data warehousing, ETL processes, and data integration, familiarity with big data technologies like Hadoop and Spark, and excellent analytical and problem-solving skills. Industry knowledge in supply chain is preferred but not mandatory. You should also have strong communication skills to interact effectively with technical and non-technical stakeholders and the ability to work well in a collaborative, fast-paced environment.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
Qualcomm India Private Limited is seeking a skilled professional with over 10 years of testing experience, particularly in data engineering. The ideal candidate will have strong coding skills in languages like Python or Java, proficiency in SQL and NoSQL databases, and hands-on experience in data engineering, ETL processes, and data warehousing QA activities. In this role, you will design and develop automated test frameworks for data pipelines and ETL processes, using tools such as Selenium, Jenkins, and Python to automate test execution.

Candidates should have experience working with cloud platforms such as AWS, Azure, or Google Cloud, and be familiar with data technologies like Databricks, Hadoop, PySpark, and Kafka. An understanding of CI/CD pipelines, DevOps practices, containerization technologies like Docker and Kubernetes, performance testing, monitoring tools, and version control systems like Git is essential. Exposure to Agile and DevOps methodologies is also required.

As a Systems Analyst at Qualcomm, you will manage project priorities, deadlines, and deliverables with minimal supervision. You will be expected to determine important work tasks, avoid distractions, and independently address setbacks in a timely manner. Additionally, you will serve as a technical lead on subsystems or small features, assign work to project teams, and work on advanced tasks to complete projects. Communication skills are crucial, as you will collaborate with project leads, make recommendations, and adapt to changes to meet deadlines.

The successful candidate will manage projects of small to medium size and complexity, apply subject-matter expertise to meet deadlines, identify test scenarios, oversee test execution, and provide QA results to the business. You will also assist in troubleshooting complex issues related to bugs in production systems, mentor team members, share subject-matter knowledge, and train business users on tools.

A Bachelor's degree with 4+ years of IT-relevant work experience, or 6+ years of IT-relevant work experience without a Bachelor's degree, is required. Strong candidates will have 6-8 years of proven testing experience, particularly in data engineering. Preferred qualifications include over 10 years of QA/testing experience, strong coding skills, and proficiency in SQL and NoSQL databases.

Qualcomm is an equal opportunity employer committed to providing accessible processes and workplace accommodations for individuals with disabilities. If you require accommodations during the application/hiring process, please contact Qualcomm directly. Please note that Qualcomm expects its employees to adhere to all applicable policies and procedures, including security requirements for protecting confidential information. Staffing and recruiting agencies are not authorized to use Qualcomm's Careers Site for submissions, and unsolicited resumes will not be accepted. For more information about this role, please contact Qualcomm Careers.
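To make the automated test-framework idea concrete, here is one hedged shape such warehouse QA checks often take; the DSN, schemas, and tables below are invented for illustration, not a description of Qualcomm's stack:

```python
# Sketch of pytest-based data-pipeline QA: row-count reconciliation and a
# null-key check. Connection DSN and table names are placeholders.
import pyodbc
import pytest

@pytest.fixture(scope="module")
def dwh():
    conn = pyodbc.connect("DSN=warehouse")  # placeholder DSN
    yield conn
    conn.close()

def test_fact_orders_matches_source_counts(dwh):
    cur = dwh.cursor()
    src = cur.execute("SELECT COUNT(*) FROM staging.orders").fetchone()[0]
    tgt = cur.execute("SELECT COUNT(*) FROM dwh.fact_orders").fetchone()[0]
    assert tgt == src, f"row-count drift: staging={src}, fact={tgt}"

def test_no_null_order_keys(dwh):
    cur = dwh.cursor()
    nulls = cur.execute(
        "SELECT COUNT(*) FROM dwh.fact_orders WHERE order_key IS NULL"
    ).fetchone()[0]
    assert nulls == 0  # every fact row must carry a business key
```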
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
punjab
On-site
As the Chief AI Officer (CAIO) at RChilli in Mohali, you will play a pivotal role in driving AI strategy, innovation, and ethical AI deployment in HR Tech. With over 10 years of experience in AI/ML, including leadership roles, you will lead the development and execution of RChilli's AI strategy in alignment with business goals. Your responsibilities will include ensuring ethical AI implementation, implementing AI for automated job descriptions, resume scoring, and candidate recommendations, and overseeing AI-powered chatbots and workforce planning.

You will lead AI research and development in areas such as NLP, machine learning, and predictive analytics. Leveraging generative AI for automating job descriptions, conversational AI for chatbots, and transformative AI for workforce planning will be crucial aspects of your role. Additionally, you will identify opportunities for AI implementation, assess third-party AI tools, and collaborate with cross-functional teams to integrate AI into HRTech solutions.

Your technical expertise in NLP, machine learning, deep learning, and predictive analytics, along with leadership skills in aligning AI innovation with business goals, will be essential. You will also build and mentor an AI team, promote AI literacy across the organization, and represent RChilli in AI forums and industry partnerships.

Joining RChilli offers the opportunity to lead AI innovation, drive impactful work in HR operations and talent acquisition, and enjoy competitive compensation and career growth opportunities. If you are a visionary AI leader ready to shape the future of HRTech, RChilli welcomes you to join as our Chief AI Officer.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
kolkata, west bengal
On-site
As a Lead Data Analyst at Next Gear India Private Limited, a subsidiary of Cotality, you will be a key member of the Analytics team serving customers in the property insurance and restoration industries. In this role, you will develop methods and models that drive data-driven decision processes, ultimately enhancing business performance for internal and external stakeholder groups. Your expertise in interpreting complex data sets and providing valuable insights will be crucial in optimizing data assets to deliver impactful results.

Joining our team offers an exciting opportunity to work in a dynamic environment where you will collaborate with cross-functional teams to support decision processes that shape the future of the property insights and analytics industry. Your role will involve leading a team of analysts to deliver client work efficiently, acting as the domain expert to internal stakeholders throughout the analytics development process, and ensuring the timely delivery of assets to customers.

Key Responsibilities:
- Collaborate with cross-functional teams to understand and document requirements for analytics products.
- Serve as the primary point of contact for new data/analytics requests and customer support.
- Lead a team of analysts to ensure timely delivery of client deliverables.
- Act as the voice of the customer to internal stakeholders during the analytics development process.
- Develop and maintain an inventory of data, reporting, and analytic product deliverables.
- Work with customer success teams to manage customer expectations for analytics deliverables.
- Create and manage tickets within internal frameworks on behalf of customers.
- Develop and optimize complex SQL queries for data extraction, transformation, and aggregation.
- Create and maintain data models, dashboards, and reports to visualize data and track key performance metrics.
- Collaborate closely with stakeholders to align project goals with business needs and provide actionable recommendations through ad-hoc analysis.
- Analyze large and complex datasets to identify trends, patterns, and insights, presenting findings to stakeholders in a clear and concise manner.

Job Qualifications:
- 7+ years of property insurance experience preferred.
- 5+ years of experience managing mid-level professional teams with a focus on data and/or performance management.
- Bachelor's degree in computer science, data science, statistics, or a related field preferred.
- Mastery-level knowledge of data analysis tools such as Excel, Tableau, or Power BI.
- Proficiency in SQL, with the ability to write complex queries and optimize performance.
- Strong analytical and problem-solving skills.
- Excellent attention to detail and the ability to work with large datasets.
- Effective communication skills, both written and verbal.
- Ability to work independently and collaborate in a team environment.

Join us at Next Gear India Private Limited and be part of a global team shaping the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
bengaluru, karnataka, india
On-site
About The Role

Location: Bangalore

Our growth plans: We process close to 4% of the country's freight on our software platform. Our goal is to get to 20% of the country's freight by 2028. This gives us a bird's-eye view of the market. We're already the largest road freight technology platform in the country, and we plan to build on this base to drive growth in software, freight marketplace, and supply chain financing to reach USD 100M in revenue by 2028.

About the Role: We are seeking a highly skilled Senior Data Engineer with 5-6 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.

Key responsibilities:
- Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.
- Implement efficient data modeling techniques to optimize performance and scalability of data systems.
- Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.
- Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.
- Utilize Spark and Databricks to process large-scale datasets efficiently and in real time.
- Implement Kafka for building real-time streaming pipelines (see the sketch after this listing) and ensure data consistency and reliability.
- Design and develop batch pipelines for scheduled data processing tasks.
- Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.
- Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.
- Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.

Minimum Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 12 years of hands-on professional experience working in data engineering or a related role.
- Strong proficiency in SQL for data extraction, transformation, and optimization.
- Strong hands-on experience with Python for data processing, automation, and scripting.
- Familiarity with PySpark or other big data frameworks.
- Familiarity with Spark architecture (distributed processing) and distributed systems.
- Understanding of data structures, algorithms, and database concepts.
- Knowledge of coding best practices and code quality standards.
- Strong analytical and problem-solving skills.
- Eagerness to learn new technologies and work in a fast-paced, collaborative environment.

Preferred Qualifications:
- Solid working knowledge of PySpark for handling large-scale distributed datasets.
- In-depth understanding of Spark architecture (executors, partitions, shuffles, caching, performance tuning).
- Exposure to data modeling, ETL pipelines, and data quality best practices.
- Familiarity with cloud data platforms (e.g., Databricks and AWS).
- Strong problem-solving and debugging skills with an eye for performance optimization.
- Ability to work collaboratively in agile teams and communicate technical concepts effectively.

Educational background: B.Tech/BE in Computer Science, Information Technology, or a related field from IITs, NITs, or IIITs.

Certifications (preferred): Databricks Certified Data Engineer Associate/Professional, AWS Big Data Specialty, or equivalent.
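For the Kafka responsibility above, a bare-bones PySpark Structured Streaming job gives the flavor; the broker address, topic, and storage paths are placeholders, and the spark-sql-kafka package is assumed to be on the classpath:

```python
# Minimal streaming ingestion sketch: Kafka topic -> Delta bronze table.
# Broker, topic, and paths are placeholders, not a real deployment.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("freight-events").getOrCreate()

events = (spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "freight-events")             # placeholder topic
    .load()
    .select(col("value").cast("string").alias("payload")))

(events.writeStream
    .format("delta")
    .option("checkpointLocation", "/chk/freight-events")  # exactly-once state
    .start("/lake/bronze/freight_events"))
```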
Posted 2 weeks ago