
8555 Data Modeling Jobs - Page 12

JobPe aggregates results for easy access, but you apply directly on each employer's job portal.

5.0 - 9.0 years

0 Lacs

Chandigarh

On-site

As a Senior Data Engineer, you will play a crucial role in supporting the Global BI team for Isolation Valves as they transition to Microsoft Fabric. Your primary responsibilities will involve data gathering, modeling, integration, and database design to facilitate efficient data management. You will develop and optimize scalable data models to serve analytical and reporting needs, utilizing Microsoft Fabric and Azure technologies for high-performance data processing.

You will collaborate with cross-functional teams such as data analysts, data scientists, and business collaborators to understand their data requirements and deliver effective solutions, and you will leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives. Your expertise in data modeling, particularly in data warehouse and lakehouse design, will be essential in designing and implementing data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services.

You will also develop ETL processes using tools like SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar platforms to prepare data for analysis and reporting, and implement data quality checks and governance practices to ensure data accuracy, consistency, and security. You will supervise and optimize data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads. The role requires strong proficiency in Business Intelligence (BI) tools such as Power BI and Tableau, along with experience in data integration and ETL tools like Azure Data Factory.

A deep understanding of Microsoft Fabric or similar data platforms, as well as comprehensive knowledge of the Azure Cloud Platform, particularly its data warehousing and storage solutions, will be necessary. Effective communication skills for conveying technical concepts to technical and non-technical stakeholders, the ability to work both independently and within a team, and a willingness to stay abreast of new technologies and business areas are also vital for success in this role.

To excel in this position, you should have 5-7 years of experience in Data Warehousing with on-premises or cloud technologies, strong analytical abilities for tackling complex data challenges, and proficiency in database management, SQL query optimization, and data mapping. A solid grasp of Excel (formulas, filters, macros, pivots, and related operations) is essential, as is proficiency in Python and SQL/Advanced SQL for data transformations and debugging, along with a willingness to work flexible hours based on project requirements.

Hands-on experience with Fabric components such as Lakehouse, OneLake, Data Pipelines, Real-Time Analytics, Power BI Integration, and Semantic Models, together with advanced SQL skills and experience with complex queries, data modeling, and performance tuning, is highly desired. Prior exposure to implementing Medallion Architecture for data processing, experience in a manufacturing environment, and familiarity with Oracle, SAP, or other ERP systems will be advantageous. A Bachelor's degree or equivalent experience in a Science-related field, good interpersonal skills in English (spoken and written), and Agile certification will set you apart as a strong candidate.

At Emerson, we are committed to fostering a workplace where every employee is valued, respected, and empowered to grow. Our culture encourages innovation, collaboration, and diverse perspectives, recognizing that great ideas stem from great teams. We invest in your ongoing career development, offering mentorship, training, and leadership opportunities to ensure your success and make a lasting impact. Employee wellbeing is a priority for us, and we provide competitive benefits plans, medical insurance options, an Employee Assistance Program, flexible time off, and other supportive resources to help you thrive.

Emerson is a global leader in automation technology and software, dedicated to helping customers in critical industries operate more sustainably and efficiently. Our commitment to our people, communities, and the planet drives us to create positive impacts through innovation, collaboration, and diversity. If you seek an environment where you can contribute to meaningful work, develop your skills, and make a difference, join us at Emerson. Let's go together towards a brighter future.
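For context on the Medallion Architecture this posting asks about, here is a minimal, illustrative sketch of its bronze/silver/gold layering in plain Python. The layer contents and the toy valve records are assumptions for illustration only; in Fabric or Databricks these layers would be Lakehouse tables populated by pipelines, not in-memory lists.

```python
# Medallion Architecture sketch: bronze (raw) -> silver (cleaned/typed)
# -> gold (business aggregate). Records are invented for illustration.
from collections import defaultdict

# Bronze: raw, unvalidated records as ingested
bronze = [
    {"valve_id": "V1", "region": "EMEA", "units": "10"},
    {"valve_id": "V2", "region": "APAC", "units": "bad"},  # malformed row
    {"valve_id": "V1", "region": "EMEA", "units": "5"},
]

def to_silver(rows):
    """Silver: cleaned and typed records; malformed rows are dropped."""
    out = []
    for r in rows:
        try:
            out.append({**r, "units": int(r["units"])})
        except ValueError:
            continue
    return out

def to_gold(rows):
    """Gold: business-level aggregate (units per region)."""
    agg = defaultdict(int)
    for r in rows:
        agg[r["region"]] += r["units"]
    return dict(agg)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'EMEA': 15}
```

The point of the pattern is that each layer only ever reads from the one before it, so data quality checks concentrate at the bronze-to-silver boundary.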

Posted 4 days ago

Apply

15.0 - 19.0 years

0 Lacs

Karnataka

On-site

As a Senior Business Intelligence Expert, you will leverage your extensive experience in PowerBI development to create high-performance and visually compelling business intelligence solutions. Your expertise in semantic modeling, data pipeline development, and API integration will play a crucial role in transforming complex data into actionable insights through intuitive dashboards that adhere to consistent branding guidelines and utilize advanced visualizations. You will be responsible for designing, developing, and maintaining enterprise-level PowerBI solutions that drive key business decisions throughout the organization. Your proficiency in data modeling, ETL processes, and visualization best practices will be essential in delivering top-notch BI assets that meet performance standards and offer exceptional user experiences.

Key Responsibilities:
- Lead optimization and performance tuning of PowerBI reports, dashboards, and datasets to ensure fast loading times and efficient data processing.
- Enhance BI user experience by implementing consistent branding, modern visual designs, and intuitive navigation across all PowerBI assets.
- Develop and maintain complex data models using PowerBI's semantic modeling capabilities for data accuracy, consistency, and usability.
- Create and maintain data ingestion pipelines using Databricks, Python, and SQL to transform raw data into structured formats suitable for analysis.
- Design and implement automated processes for integrating data from various API sources.
- Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions.
- Provide technical leadership and mentoring to junior BI developers.
- Document technical specifications, data dictionaries, and user guides for all BI solutions.

Required Qualifications:
- 15+ years of experience in business intelligence, data analytics, or a related field.
- Strong experience with Databricks.
- Expert-level proficiency with PowerBI Desktop, PowerBI Service, and PowerBI Report Server.
- Advanced knowledge of DAX, the M language, and PowerQuery for sophisticated data modeling.
- Strong expertise in semantic modeling principles and best practices.
- Extensive experience with custom visualizations and complex dashboard design.
- Proficiency in SQL for data manipulation and optimization.
- Experience with Python for data processing and ETL workflows.
- Proven track record of API integration and data ingestion from diverse sources.
- Strong understanding of data warehouse concepts and dimensional modeling.
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).

Nice to Have Skills:
- Experience implementing AI-powered analytics tools and integrating them with PowerBI.
- Proficiency with Microsoft Copilot Studio for creating AI-powered business applications.
- Expertise across the Microsoft Power Platform (Power Apps, Power Automate, Power Virtual Agents).
- Experience with third-party visualization tools such as Inforiver for enhanced reporting capabilities.
- Knowledge of writeback architecture and implementation in PowerBI solutions.
- Experience with PowerBI APIs for custom application integration and automation.
- Familiarity with DevOps practices for BI development and deployment.
- Certifications such as Microsoft Certified: Data Analyst Associate, Power BI Developer, or Azure Data Engineer.

This role offers an exciting opportunity to work with cutting-edge business intelligence technologies and deliver impactful solutions that drive organizational success through data-driven insights.
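One responsibility above is automated ingestion from API sources. The sketch below shows the common paginated-ingestion loop behind that kind of pipeline. The endpoint shape, page size, and the `fetch_page` stub are assumptions; a real pipeline would issue HTTP requests against the source API instead of this in-memory stand-in.

```python
# Paginated API ingestion sketch: walk pages until the source reports
# no more data, accumulating records for downstream transformation.

def fetch_page(page, page_size=2):
    """Stand-in for an HTTP GET returning one page of records."""
    data = [{"id": i, "amount": i * 10} for i in range(1, 6)]  # 5 records
    start = (page - 1) * page_size
    chunk = data[start:start + page_size]
    return {"items": chunk, "has_more": start + page_size < len(data)}

def ingest_all(fetch):
    """Drain every page from a paginated source into one record list."""
    records, page = [], 1
    while True:
        resp = fetch(page)
        records.extend(resp["items"])
        if not resp["has_more"]:
            break
        page += 1
    return records

rows = ingest_all(fetch_page)
print(len(rows))  # 5
```

Injecting the fetch function keeps the loop testable without network access, which is why the pattern shows up in scheduled ingestion jobs.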

Posted 4 days ago

Apply

2.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are an experienced Data Quality Analyst with 8-12 years of experience in the field. Your role will involve supporting the product team by analyzing large volumes of pre-processed and enriched third-party data to provide valuable insights. Your responsibilities will include interpreting extensive data sets using statistical techniques, generating reports on the impact of strategies, identifying trends or issues for senior management, and testing the effectiveness of different actions using models and data mining methods. You should be proficient in using various data analysis tools, building and implementing models, creating algorithms, and running simulations. Additionally, you will collaborate with multiple stakeholders and functional teams to improve business outcomes by uncovering solutions within large data sets.

Key Responsibilities:
- Evaluate the efficiency and accuracy of new data sources and gathering methods.
- Collaborate with diverse functional teams to deploy models and monitor results.
- Develop processes and tools to assess model performance, coverage, and data accuracy.

Qualifications:
- Minimum 7 years of Quality Assurance experience.
- 5+ years of database querying experience (MySQL, Oracle) and proficiency in statistical computer languages (R, Python) for data manipulation and insights.
- Familiarity with Linux-based operating systems and exposure to Docker containers.
- 3+ years of experience in a cloud-based environment (e.g., AWS).
- 2+ years of expertise in data visualization tools such as Tableau/Kibana.
- Knowledge of machine learning techniques such as clustering, decision tree learning, and artificial neural networks will be advantageous.
- Strong written and verbal communication skills for effective team coordination.
- Enthusiasm for learning new technologies and methodologies.
- Experience in analyzing data from third-party providers (e.g., Google Analytics, Site Catalyst) and presenting insights to stakeholders.
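As a concrete illustration of the data-accuracy checks this role describes, here is a small sketch of two standard data-quality metrics, completeness (null rate) and uniqueness (duplicate rate), over a batch of third-party records. The field names and sample batch are invented for illustration.

```python
# Two basic data-quality metrics over a list of record dicts.

def null_rate(rows, field):
    """Fraction of rows where `field` is missing or None."""
    missing = sum(1 for r in rows if r.get(field) is None)
    return missing / len(rows)

def duplicate_rate(rows, key):
    """Fraction of rows whose `key` value repeats an earlier row's."""
    seen, dups = set(), 0
    for r in rows:
        k = r.get(key)
        if k in seen:
            dups += 1
        seen.add(k)
    return dups / len(rows)

batch = [
    {"user_id": 1, "country": "IN"},
    {"user_id": 2, "country": None},
    {"user_id": 2, "country": "IN"},   # duplicate user_id
    {"user_id": 3, "country": "US"},
]

print(null_rate(batch, "country"))       # 0.25
print(duplicate_rate(batch, "user_id"))  # 0.25
```

Tracking these rates per data source over time is one simple way to "evaluate the efficiency and accuracy of new data sources" as the posting puts it.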

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Nagpur, Maharashtra

On-site

As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should have a high level of proficiency in writing complex yet efficient SQL queries. Your role will involve working extensively on PL/SQL Packages, Procedures, Functions, Triggers, Views, MViews, External tables, partitions, and Exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle.

In this position, it is essential to have experience in Data Modeling and Warehousing concepts such as Star Schema, OLAP, OLTP, Snowflake schema, Fact Tables for Measurements, and Dimension Tables. Additionally, familiarity with UNIX Scripting, Python/Spark, and Big Data concepts will be beneficial.

If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.
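To make the star-schema concepts named above concrete, here is a toy sketch of a fact table of measurements joined to a dimension table via surrogate keys, then rolled up, the shape an ETL job typically produces. The table contents are invented for illustration; in practice these would be Oracle tables populated by Talend jobs, not Python dicts.

```python
# Star schema sketch: fact rows reference a dimension by surrogate key;
# a rollup joins each fact to its dimension attributes and aggregates.
from collections import defaultdict

# Dimension table: surrogate key -> descriptive attributes
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Hardware"},
    3: {"name": "Service", "category": "Support"},
}

# Fact table: one row per measurement, keyed to the dimension
fact_sales = [
    {"product_key": 1, "amount": 100},
    {"product_key": 2, "amount": 250},
    {"product_key": 3, "amount": 75},
    {"product_key": 1, "amount": 50},
]

def sales_by_category(facts, dim):
    """Join each fact to its dimension row and aggregate by category."""
    totals = defaultdict(int)
    for f in facts:
        category = dim[f["product_key"]]["category"]
        totals[category] += f["amount"]
    return dict(totals)

print(sales_by_category(fact_sales, dim_product))
# {'Hardware': 400, 'Support': 75}
```

A snowflake schema differs only in that the dimension itself would be normalized into further lookup tables (e.g., category split out of product).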

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Engineering Manager - Analytics & Data Engineering at Highspot, you will be responsible for leading and nurturing a team of data analysts, data scientists, and data engineers. Your primary objective will be to drive the development of cutting-edge analytics capabilities within our B2B SaaS product, while also maintaining a company-wide data warehouse to facilitate data-driven decision-making. By leveraging your expertise in statistical analysis, machine learning techniques, and business acumen, you will uncover valuable insights from our data, enabling impactful decisions and fueling our growth. Your role will involve guiding strategic decisions related to data systems, analytics capabilities, team operations, and engineering culture. We are looking for a candidate who is passionate about team building and scaling, values-driven, committed to fostering a positive culture, self-directed, inquisitive, and resourceful.

Your key responsibilities will include:
- Leading a team of data analysts, data scientists, and data engineers, motivating them to deliver their best work and providing hands-on support when needed.
- Analyzing core business topics using Highspot's product and business data to derive insights that drive product development and enhance platform effectiveness.
- Applying statistical analysis, machine learning, and operations research techniques to develop solutions that drive impactful business outcomes such as operational efficiency improvements, customer churn rate reduction, and resource allocation optimization.
- Driving the team's data & analytics strategy, technical roadmap, and data storage solutions.
- Defining top-level business, team, and product metrics, and creating automated reports/dashboards to support strategic decision-making.
- Developing and maintaining scalable end-to-end data pipelines and data warehouse systems that are essential for various teams across the company and ensure compliance with global data protection requirements.
- Leading the development of custom scorecards and visualizations in the product to provide actionable insights to customers.
- Contributing your technical expertise to the evolution of Highspot's software architecture and stack to meet the demands of hyper-growth and ensure high availability and reliability across multiple data centers.
- Collaborating with key partners and stakeholders to deliver high-impact customer value and promote effective communication within and outside the team.

To be considered for this role, you should possess:
- A Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in designing and building scalable, high-quality customer-facing software.
- 5+ years of experience in advanced analytics and cloud data engineering.
- Proficiency in statistical analysis, data science models, data pipelines, and deriving actionable insights from complex datasets.
- Strong skills in SQL, Python, object-oriented programming, and web technologies.
- Experience in presenting to C-level executives and collaborating with various business functions.
- A track record of fostering a high-performing team and promoting a positive work culture.
- An entrepreneurial spirit and a commitment to delivering high-quality results.

At Highspot, we are committed to diversity and inclusion. If this role aligns with your skills and interests, we encourage you to apply, even if you do not meet all the requirements listed above.

Posted 4 days ago

Apply

13.0 - 17.0 years

0 Lacs

Maharashtra

On-site

Birlasoft is a powerhouse that brings together domain expertise, enterprise solutions, and digital technologies to redefine business processes. With a consultative and design thinking approach, we drive societal progress by enabling our customers to run businesses with efficiency and innovation. As part of the CK Birla Group, a multibillion-dollar enterprise, we have a team of 12,500+ professionals dedicated to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Sustainable Responsibility (CSR) activities, demonstrating our commitment to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose.

As an Azure Tech PM at Birlasoft, you will be responsible for leading and delivering complex data analytics projects. With 13-15 years of experience, you will play a critical role in overseeing the planning, execution, and successful delivery of data analytics initiatives while managing a team of 15+ skilled resources. You should have exceptional communication skills, a deep understanding of Agile methodologies, and a strong background in managing cross-functional teams on data analytics projects.

Key Responsibilities:
- Lead end-to-end planning, coordination, and execution of data analytics projects, ensuring adherence to project scope, timelines, and quality standards.
- Guide the team in defining project requirements, objectives, and success criteria using your extensive experience in data analytics.
- Apply Agile methodologies to create and maintain detailed project plans, sprint schedules, and resource allocation for efficient project delivery.
- Manage a team of 15+ technical resources, fostering collaboration and a culture of continuous improvement.
- Collaborate closely with cross-functional stakeholders to align project goals with business objectives.
- Monitor project progress, identify risks, issues, and bottlenecks, and implement mitigation strategies.
- Provide regular project updates to executive leadership, stakeholders, and project teams.
- Facilitate daily stand-ups, sprint planning, backlog grooming, and retrospective meetings to promote transparency and efficiency.
- Drive the implementation of best practices for data analytics, ensuring data quality, accuracy, and compliance with industry standards.
- Act as a point of escalation for project-related challenges and work with the team to resolve issues promptly.
- Collaborate with cross-functional teams to ensure successful project delivery, including testing, deployment, and documentation.
- Provide input to project estimation, resource planning, and risk management activities.

Mandatory Experience:
- Minimum 5+ years of Technical Project Manager experience on Data Lake and Data Warehousing (DW) projects.
- Strong understanding of DW process execution, from acquiring data through to visualization.
- Minimum 3+ years of exposure to Azure skills such as Azure ADF, Azure Databricks, Synapse, SQL, and PowerBI, or experience managing at least 2 end-to-end Azure Cloud projects.

Other Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 13-15 years of progressive experience in technical project management focusing on data analytics and data-driven initiatives.
- In-depth knowledge of data analytics concepts, tools, and technologies.
- Exceptional leadership, team management, interpersonal, and communication skills.
- Demonstrated success in delivering data analytics projects on time, within scope, and meeting quality expectations.
- Strong problem-solving skills and a proactive attitude towards identifying challenges.
- Project management certifications such as PMP, PMI-ACP, or CSM would be an added advantage.
- Ability to thrive in a dynamic and fast-paced environment, managing multiple projects simultaneously.

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be working as a Salesforce Business Analyst at Innova ESI in Chennai, focusing on Salesforce.com administration, business analysis, and optimizing business processes to drive digital transformation and enhance business operations. Your role will involve gathering business requirements, creating data flow diagrams, facilitating business process optimization/transformation, supporting the design of training materials, conducting internal training sessions for business users, and ensuring a smooth transition to newly implemented Salesforce processes.

Key responsibilities include advising on the best use of Salesforce features with an emphasis on out-of-the-box functionality, building configuration on the Salesforce platform, eliciting requirements under Agile and Waterfall methods, preparing functional prototypes and wireframes, collaborating with delivery teams, engaging with business product owners, and translating technical terms into business-friendly language.

You should have a strong understanding of the Salesforce platform, exposure to the Banking & Finance domain, excellent communication skills, and the ability to analyze and design data models, user interfaces, business logic, and security for custom applications. Hands-on experience with Sales/Service Cloud, Salesforce Administrator Certification, technology change management expertise, and familiarity with Agile/Waterfall projects are essential. Desirable skills include strong organizational skills, additional Salesforce certifications, and previous Salesforce project experience.

You must have at least 6+ years of relevant experience, and the role is based in Pune, Chennai, or Bangalore. Immediate joiners are preferred. If you meet the requirements, please share your resume with kanchan.arya@innovaesi.com for consideration.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Data Warehouse (DWH) professional with relevant experience in Google Cloud Platform (GCP), you will be responsible for developing and implementing robust data architectures. This includes designing data lakes, data warehouses, and data marts utilizing GCP services such as BigQuery, Dataflow, Dataproc, and Cloud Storage. Your role will involve designing and implementing data models that meet business requirements while ensuring data integrity, consistency, and accessibility.

Your deep understanding of GCP services and best practices for data warehousing, data analytics, and machine learning will be crucial in this role. You will also plan and execute data migration strategies from on-premises or other cloud environments to GCP, and optimize data pipelines and query performance to facilitate efficient data processing and analysis.

Proven experience in managing teams and project delivery will be essential for success in this position, as will collaborating closely with stakeholders to understand their requirements and deliver effective solutions. Any experience with Looker will be considered advantageous.

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

Sabre is a technology company that powers the global travel industry. By leveraging next-generation technology, we create global technology solutions that take on the biggest opportunities and solve the most complex challenges in travel. Positioned at the center of the travel industry, we shape the future by offering innovative advancements that pave the way for a more connected and seamless ecosystem. Our solutions power mobile apps, online travel sites, airline and hotel reservation networks, travel agent terminals, and many other platforms, connecting people with moments that matter.

Sabre is seeking a talented Senior Data Science Engineer for the SabreMosaic team. In this role, you will plan, design, develop, and test data science and data engineering software systems or applications for software enhancements and new products based on cloud-based solutions.

Role and Responsibilities:
- Develop, code, test, and debug new complex data-driven software solutions or enhancements to existing products.
- Design, plan, develop, and improve applications using advanced cloud-native technology.
- Work on issues requiring in-depth knowledge of organizational objectives and implement strategic policies in selecting methods and techniques.
- Encourage high coding standards, best practices, and high-quality output.
- Interact regularly with subordinate supervisors, architects, product managers, HR, and others on project or team performance matters.
- Provide technical mentorship and cultural/competency-based guidance to teams.
- Offer larger business/product context and mentor on specific tech stacks/technologies.

Qualifications and Education Requirements:
- Minimum 4-6 years of related experience as a full-stack developer.
- Expertise in Data Engineering/DW projects with Google Cloud-based solutions.
- Experience designing and developing enterprise data solutions on the GCP cloud platform.
- Experience with relational and NoSQL databases such as Oracle, Spanner, and BigQuery.
- Expert-level SQL skills for data manipulation and validation.
- Experience designing data models, data warehouses, data lakes, and analytics platforms on GCP.
- Expertise in designing ETL data pipelines and data processing architectures for data warehouses.
- Strong experience in designing Star and Snowflake schemas and knowledge of dimensional data modeling.
- Experience collaborating with data scientists, data teams, and engineering teams on the Google Cloud platform for data analysis and data modeling.
- Familiarity with integrating datasets from multiple sources for data modeling for analytical and AI/ML models.
- Understanding of and experience with Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive, and Docker.
- Expertise in Java Spring Boot, Python, or other programming languages used for data engineering and integration projects.
- Strong problem-solving and analytical skills.
- Exposure to AI/ML, MLOps, and Vertex AI is an advantage.
- Familiarity with DevOps practices such as CI/CD pipelines.
- Airline domain experience is a plus.
- Excellent spoken and written communication skills.
- GCP Professional Data Engineer certification is a plus.

We will carefully consider your application and review your details against the position criteria. Only candidates who meet the minimum criteria for the role will proceed in the selection process.
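The qualifications above ask for Pub/Sub and Kafka experience. Both implement the publish/subscribe messaging model, which the toy in-memory sketch below illustrates: topics, subscribers, and fan-out delivery. Real Pub/Sub or Kafka clients talk to a networked broker with durability and ordering guarantees; this sketch only shows the model, and the topic name and messages are invented.

```python
# Minimal in-memory publish/subscribe broker: every subscriber on a
# topic receives every message published to that topic (fan-out).
from collections import defaultdict

class Broker:
    def __init__(self):
        self._subs = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        # Fan out: deliver the message to each subscriber in turn.
        for cb in self._subs[topic]:
            cb(message)

broker = Broker()
received = []
broker.subscribe("flights", received.append)
broker.subscribe("flights", lambda m: received.append(m.upper()))

broker.publish("flights", "delay: AI101")
print(received)  # ['delay: AI101', 'DELAY: AI101']
```

The decoupling this buys, publishers never knowing who consumes, is what makes pub/sub a common backbone for the multi-source data integration the posting describes.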

Posted 4 days ago

Apply

8.0 - 12.0 years

0 Lacs

Bhubaneswar

On-site

As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your role will involve collaborating with the team to ensure project progress and providing solutions that align with business needs and application specifications. You are expected to be a subject matter expert (SME) and lead the team in implementing innovative solutions.

Key Responsibilities:
- Collaborate with and manage the team to perform effectively.
- Make team decisions and contribute to key decisions across multiple teams.
- Provide solutions to problems within your team and across various teams.
- Conduct regular team meetings to ensure project progress.
- Stay updated on industry trends and technologies.

Professional & Technical Skills Required:
- Proficiency in Stibo Product Master Data Management.
- Strong understanding of data modeling and data architecture.
- Experience in data integration and data migration.
- Hands-on experience in application development and customization.
- Knowledge of data governance and data quality management.

A minimum of 7.5 years of experience in the field is required, along with 15 years of full-time education. This position is based in Bhubaneswar.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Market Analyst, you will play a crucial role in supporting business growth by providing insightful market and competitive analysis. Your primary responsibilities will include conducting in-depth market research, analyzing industry developments, monitoring competitor activities, and gathering data on customer needs and product performance. You will collaborate with cross-functional teams to offer data-driven recommendations, support strategic decision-making, and develop market forecasts.

To qualify for this position, you should hold a Bachelor's degree in Marketing, Economics, Business Analytics, or a related field, with a Master's degree considered a plus. You must have 3-5 years of experience in market research or business analysis, particularly in international markets such as Europe, the UK, the USA, and/or the Gulf regions. Strong analytical and critical thinking skills, proficiency in data analysis tools and software, excellent research abilities, and effective communication skills are essential for this role. Knowledge of statistical methods and data modeling, and experience with CRM and market intelligence platforms, will be advantageous, as will the ability to work independently, manage multiple projects, and communicate in multiple languages.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 - 0 Lacs

Karnataka

On-site

Overview of the Company:
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, the company helps the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, embracing challenges and winning together are core values that guide the company towards achieving its goals and supporting its people. The company is dedicated to creating a significant impact for its employees by fostering a culture that sparks innovation and supports professional and personal growth.

Overview of the Role:
We are seeking an experienced Data Architect to design, develop, and maintain the Google Cloud data architecture for 66degrees. The ideal candidate will have a strong background in data architecture, data engineering, and cloud technologies, with specific experience in managing data across Google Cloud platforms.

Responsibilities:
- GCP Cloud Architecture: Design, implement, and manage robust, scalable, and cost-effective cloud-based data architectures on Google Cloud Platform (GCP), utilizing services like BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud Dataproc, Cloud Run, and Cloud Composer. Experience in designing cloud architectures on Oracle Cloud is a plus.
- Data Modeling: Develop and maintain conceptual, logical, and physical data models to support various business needs.
- Big Data Processing: Design and implement solutions for processing large datasets using technologies such as Spark and Hadoop.
- Data Governance: Establish and enforce data governance policies, including data quality, security, compliance, and metadata management.
- Data Pipelines: Build and optimize data pipelines for efficient data ingestion, transformation, and loading.
- Performance Optimization: Monitor and tune data systems to ensure high performance and availability.
- Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and provide architectural guidance.
- Innovation: Stay current with the latest technologies and trends in data architecture and cloud computing.

Qualifications:
- GCP Core Services: In-depth knowledge of GCP data services, including BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud Dataproc, Cloud Run, and Cloud Composer.
- Data Modeling: Expertise in data modeling techniques and best practices.
- Big Data Technologies: Hands-on experience with Spark and Hadoop.
- Cloud Architecture: Proven ability to design scalable, reliable, and cost-effective cloud architectures.
- Data Governance: Understanding of data quality, security, compliance, and metadata management.
- Programming: Proficiency in SQL, Python, and dbt (Data Build Tool).
- Problem-Solving: Strong analytical and problem-solving skills.
- Communication: Excellent written and verbal communication skills.
- A Bachelor's degree in Computer Science, Computer Engineering, Data, or a related field, or equivalent work experience, is required.
- GCP Professional Data Engineer or Cloud Architect certification is a plus.
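To ground the data-governance responsibility above, here is a hedged sketch of one common enforcement mechanism: validating rows against a simple data contract (required fields and types) before they enter the warehouse. The contract, field names, and sample rows are invented for illustration; production systems would typically use a schema tool rather than hand-rolled checks.

```python
# Data-contract sketch: flag rows that are missing required fields
# or carry the wrong type, before loading them downstream.

CONTRACT = {"order_id": int, "email": str}  # field -> expected type

def violations(rows, contract=CONTRACT):
    """Return (row_index, field, reason) for every contract breach."""
    problems = []
    for i, row in enumerate(rows):
        for field, ftype in contract.items():
            if field not in row:
                problems.append((i, field, "missing"))
            elif not isinstance(row[field], ftype):
                problems.append((i, field, "wrong type"))
    return problems

rows = [
    {"order_id": 1, "email": "a@x.com"},
    {"order_id": "2", "email": "b@x.com"},  # wrong type
    {"email": "c@x.com"},                   # missing order_id
]
print(violations(rows))
# [(1, 'order_id', 'wrong type'), (2, 'order_id', 'missing')]
```

Routing violating rows to a quarantine table instead of rejecting the whole batch is a common design choice, since it preserves the audit trail governance policies usually require.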

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a QlikView Administrator at Strawberry InfoTech in Gurugram, you will be responsible for installing, configuring, and maintaining QlikView servers, developing and supporting QlikView applications, troubleshooting issues, and ensuring system performance and availability. You will provide 1st- and 2nd-level support for Qlik applications, including QlikView, Qlik Sense, and Qlik NPrinting.

Your responsibilities will include troubleshooting and resolving issues related to data loads, application performance, and user access, as well as monitoring application performance, data load processes, and system health. You will assist end-users with navigation, report generation, and application functionality; track and document issues, resolutions, and changes; and escalate complex issues to higher levels as needed. Additionally, you will ensure data accuracy and integrity by validating data loads and performing necessary checks, collaborate with developers, data engineers, and business analysts to implement fixes and enhancements, support the implementation of system upgrades, patches, and enhancements, maintain up-to-date support and process documentation, user guides, and best practices, and provide training and guidance to end-users on how to use Qlik applications effectively.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 5+ years of experience as a QlikView Administrator. You should be proficient in QlikView, Qlik Sense, and Qlik NPrinting, have a good understanding of data modeling and ETL processes in Qlik, be familiar with SQL and data visualization concepts, and have experience troubleshooting application and data-related issues. Strong analytical and problem-solving skills, excellent communication and interpersonal skills, and the ability to work independently and in a team environment are essential for this role.

Qlik certifications (e.g., QlikView Developer, Qlik Sense Data Architect) are considered a plus. If you are interested in this opportunity, please share your updated resume at deepak.k@strawberryinfotech.com.

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

The candidate should possess a bachelor's or equivalent degree along with a minimum of 2 years of experience. The ideal candidate must have hands-on experience in SAP MDG projects, specifically in performing MDG configurations and customizations such as data modeling, UI modeling, process modeling, the Data Replication Framework, key mapping, rules and derivations, and BRF+. Expertise in SAP MDG configurations and replication configurations, and technical knowledge of MDG workflows and ERP tables, is a must. The candidate should have full-lifecycle implementation experience, including blueprinting, fit-gap analysis, configuration, data migration, cutover, and go-live. Additionally, proficiency in customization of MDG workflows, Floor Plan Manager, WDABAP, and BRF+, along with hands-on development experience in BADI/ABAP, is required. Knowledge of WDABAP framework-based customizations, OOPs programming in SAP MDG, FPM, and UI customizations is highly desirable. The candidate should be able to collaborate with technical teams to complete SAP MDG implementations for Material, Customer, Supplier, and FI objects.

As part of the Infosys consulting team, the role will involve actively supporting the consulting team in various project phases, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. Responsibilities will also include researching and exploring alternatives to recommended solutions, creating requirement specifications from business needs, and defining to-be processes and detailed functional designs based on requirements. The candidate will assist in configuring solution requirements, diagnosing issues, seeking clarifications, identifying solution alternatives, and contributing to unit-level and organizational initiatives to provide high-quality, value-adding solutions to customers. The role requires collaborating with clients to identify business challenges; refining, analyzing, and structuring relevant data; staying updated on the latest technologies and trends; logical thinking and problem-solving skills; and the ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains is preferred.

Location of posting: Infosys Ltd. currently has open positions in multiple locations across India, including Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Vizag, Kolkata, Mysore, and Hubli. While the posting location is subject to business requirements, efforts will be made to offer the location of choice wherever possible.

Posted 4 days ago

Apply

3.0 - 7.0 years

7 - 12 Lacs

Gurugram

Work from Office

Min 3-5 years of AWS ETL development experience. Must have experience with AWS cloud services: EC2, IAM, KMS keys, AWS Lambda, AWS Batch, Terraform/CloudFormation (CFT), EventBridge, Managed Kafka (MSK), Kinesis, Glue, and PySpark. Understanding of data modelling concepts required.
Required Candidate profile: Skills - Java (1.8 and above) and AWS; MuleSoft is good to have. Knowledge of Python and other programming languages.
Call Vikas 8527840989 Email vikasimaginators@gmail.com

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

delhi

On-site

As a Data Engineer at SaveLIFE Foundation, you will be a crucial part of the team responsible for creating, developing, and maintaining data-driven solutions that align with our goal of utilizing technology and evidence-based interventions to save lives. This role suits individuals who appreciate the dynamics of small organizations, are adept at handling multiple tasks, and thrive in a flexible work environment. Working within a compact team, you will take on varied responsibilities, encourage innovation, and have the chance to make a substantial, tangible difference.

Your main duties will involve extensive coding in Python and PHP to automate and enhance data workflows and applications. You will develop and maintain PHP-based applications that integrate seamlessly with data pipelines and backend systems, and organize, clean, and convert data from diverse sources such as SQL databases, APIs, and scraped content, automating wherever feasible. Collaboration with stakeholders to grasp requirements, identify challenges, and devise efficient data solutions that improve decision-making will also be a key aspect of your role, as will contributing insights through analytical tools and supporting data visualization initiatives. Building and sustaining data pipelines for real-time and batch data processing, integrating third-party APIs for data extraction and updates, and working alongside cross-functional teams to test, deploy, and refine solutions in a production environment will also be essential tasks.

In terms of technical skills, proficiency in Python, SQL, and PHP is a must. You should have experience building robust backend systems with PHP frameworks such as Laravel, Symfony, or CodeIgniter, and constructing and managing data pipelines in real-world settings. Proficiency with Python libraries such as requests, scrapy, pandas, and sqlalchemy for ETL (Extract, Transform, Load) operations is crucial, as is experience working with third-party APIs and integrating them into backend applications. A strong understanding of relational databases and data modeling techniques is essential, along with familiarity with tools like Airflow, Kafka, and PySpark for data engineering workflows. Hands-on experience with cloud platforms like AWS for hosting and scaling solutions, familiarity with DevOps tools like Docker and Terraform, and proficiency in debugging and optimizing PHP-based systems and database interactions are also expected.

Qualifications & Experience:
- A Bachelor's degree in Computer Science, Data Science, or a related technical field.
- At least 2 years of experience in data engineering, software development, or related roles.

Preferred Skills:
- Experience with R and R Shiny is considered advantageous.
- Familiarity with PHP database interactions (PDO, MySQLi) and knowledge of security best practices is beneficial.

Personal Characteristics:
- Demonstrated ability to work both independently and collaboratively in a fast-paced, multicultural setting.
- Strong communication skills, both verbal and written, in English.
- A genuine passion for leveraging technology and evidence-informed solutions to enhance lives.
- An open-minded self-starter who excels at tackling new, complex challenges.
- A deep dedication to SaveLIFE Foundation's mission of creating meaningful impact in underserved communities.
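The extract-transform-load workflow this role describes can be sketched in plain Python. This is a simplified illustration only: it uses the standard library's json and sqlite3 modules as stand-ins for the production tools the posting names (requests, pandas, sqlalchemy), and the payload and table names are hypothetical.

```python
import json
import sqlite3

# Extract: in production this would come from an API (e.g. via requests);
# here an inline JSON payload stands in for the response body.
raw = json.loads('[{"id": 1, "city": " Delhi "}, {"id": 2, "city": null}]')

# Transform: trim whitespace and drop records missing required fields.
clean = [
    {"id": r["id"], "city": r["city"].strip()}
    for r in raw
    if r.get("city")
]

# Load: write the cleaned records into a relational table
# (sqlite3 standing in for a production SQL database).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE incidents (id INTEGER PRIMARY KEY, city TEXT)")
conn.executemany("INSERT INTO incidents VALUES (:id, :city)", clean)

rows = conn.execute("SELECT id, city FROM incidents").fetchall()
print(rows)  # [(1, 'Delhi')]
```

In a real pipeline each stage would be a separate, retryable task (e.g. an Airflow operator), but the extract/transform/load separation stays the same.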

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Senior ETL Developer on the Data Services Team, you will take a lead role in ETL design, data modeling, and ETL development. Your responsibilities will include facilitating best-practice guidelines, providing technical leadership, working with stakeholders to translate requirements into solutions, gaining approval for designs and effort estimates, and documenting work via functional and technical specs. You will also analyze processes for gaps and weaknesses, prepare roadmaps and migration plans, and communicate progress using the Agile methodology.

To excel in this role, you should have:
- At least 5 years of experience with Oracle, data warehousing, and data modeling.
- 4 years of experience with ODI or Informatica IDMC.
- 3 years of experience with Databricks Lakehouse and/or Delta tables.
- 2 years of experience designing, implementing, and supporting a Kimball-method data warehouse on SQL Server or Oracle.
- Strong SQL skills and a background in data integration, data security, and enterprise data warehouse development.
- Experience with change management, release management, and source code control practices.

The ideal candidate will have a high school diploma or equivalent, with a preference for a Bachelor of Arts or Bachelor of Science degree in computer science, systems analysis, or a related area. If you are enthusiastic about leveraging your ETL expertise to drive digital modernization and enhance data services, we encourage you to apply for this role and join our dynamic team.
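The Kimball-method warehouse mentioned above organizes data into fact and dimension tables joined in a star schema. A minimal sketch of the pattern, using sqlite3 purely for illustration rather than SQL Server or Oracle, with hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes keyed by a surrogate key.
conn.execute("""CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)""")
# Fact table: numeric measures at a declared grain (one row per sale),
# with foreign keys pointing at the dimensions.
conn.execute("""CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    sale_date TEXT, amount REAL)""")

conn.execute("INSERT INTO dim_product VALUES (1, 'Valve', 'Hardware')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, "2024-01-05", 99.0), (1, "2024-01-06", 45.5)])

# A typical BI query: join facts to dimensions and aggregate.
total = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category""").fetchone()
print(total)  # ('Hardware', 144.5)
```

The same shape scales to many dimensions (date, customer, store); the design work is mostly in choosing the fact grain and conforming the dimensions across subject areas.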

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

We are looking for an experienced and skilled Azure Data Engineer to join our team at Creant for a contract-based position in Pune. As an Azure Data Engineer, you will be responsible for designing, developing, and implementing data analytics and data warehouse solutions on the Azure Data Platform, collaborating closely with business stakeholders, data architects, and technical teams to ensure efficient data integration, transformation, and availability.

Your key responsibilities will include designing, developing, and implementing data warehouse and data analytics solutions on the Azure Data Platform. You will create and manage data pipelines using Azure Data Factory (ADF) and Azure Databricks, and work extensively with Azure Application Insights, Dataverse, and PowerCAT tools to ensure efficient data processing and integration. Additionally, you will implement and manage data storage solutions using Azure SQL Database and other Azure data services, and design and develop Logic Apps and Azure Function Apps for data processing, orchestration, and automation. You will collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions; perform data validation and quality checks and ensure data consistency across systems; monitor, troubleshoot, and optimize data solutions for performance, scalability, and security; and prepare technical documentation and support project handover to operations teams.

The primary skills required for this role include:
- Strong experience as a Data Engineer, with 6 to 10 years of relevant experience.
- Expertise in Azure data engineering services such as Azure Application Insights, Dataverse, PowerCAT tools, Azure Data Factory (ADF), Azure Databricks, Azure SQL Database, Azure Function Apps, and Azure Logic Apps.
- Proficiency in ETL/ELT processes, data integration, and data migration.
- Solid understanding of data warehouse architecture and data modeling principles.
- Experience working on large-scale data platforms and handling complex data workflows.
- Familiarity with Azure analytics services and related data tools.
- Strong knowledge of SQL and of Python or Scala for data manipulation and processing.

Preferred skills for this role include knowledge of Azure Synapse Analytics, Cosmos DB, and Azure Monitor; a good understanding of data governance, security, and compliance; and strong problem-solving, troubleshooting, communication, and stakeholder management skills.
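The "data validation and quality checks" responsibility above is often implemented as a set of rule functions run against each batch before it is loaded. A minimal, framework-agnostic sketch (in a real pipeline this might be a Databricks notebook step or an ADF validation activity; the column names are hypothetical):

```python
def check_batch(rows):
    """Run basic quality rules over a batch of records and
    return a list of human-readable violations."""
    problems = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: every record needs a unique, non-null id.
        if row.get("id") is None:
            problems.append(f"row {i}: missing id")
        elif row["id"] in seen_ids:
            problems.append(f"row {i}: duplicate id {row['id']}")
        else:
            seen_ids.add(row["id"])
        # Rule 2: amount must be a non-negative number.
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            problems.append(f"row {i}: invalid amount")
    return problems

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": -5},      # duplicate id and negative amount
    {"id": None, "amount": 3.0},  # missing id
]
violations = check_batch(batch)
print(violations)
```

A pipeline would typically fail (or quarantine the batch) when the returned list is non-empty, and log the violations as data quality metrics.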

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

kochi, kerala

On-site

As a skilled professional in ETL testing and data warehousing, your primary responsibility will be to design and execute test plans, test cases, and test scripts for ETL processes. You will perform data validation and verification to ensure data integrity and accuracy, and identify, document, and track defects and issues in the ETL processes, collaborating closely with data engineers and developers to troubleshoot and resolve data-related issues. Your role will also involve participating in requirement analysis and providing valuable feedback on data quality and testing requirements, as well as generating and maintaining test documentation and reports to ensure comprehensive and accurate records.

To excel in this position, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have 4-6 years of experience in ETL testing and data warehousing, with strong knowledge of ETL tools and processes. Proficiency in SQL and database management systems is essential, along with familiarity with data modeling and data architecture concepts. If you are passionate about ensuring data quality and accuracy through meticulous testing, and possess the relevant qualifications and experience, we encourage you to apply for this challenging and rewarding opportunity.
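The data validation and verification described above usually means reconciling source and target result sets after a load, e.g. comparing row counts and column totals. A minimal sketch of such a reconciliation check (sqlite3 stands in for the actual source and target databases; table and column names are hypothetical):

```python
import sqlite3

# Set up toy source and target tables as if a load had just completed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

def reconcile(conn, src, tgt):
    """Compare row counts and column totals between two tables,
    returning a dict of check name -> pass/fail."""
    count = lambda t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
    total = lambda t: conn.execute(f"SELECT SUM(amount) FROM {t}").fetchone()[0]
    return {
        "row_count_match": count(src) == count(tgt),
        "amount_sum_match": total(src) == total(tgt),
    }

result = reconcile(conn, "src", "tgt")
print(result)  # {'row_count_match': True, 'amount_sum_match': True}
```

Real test suites extend the same idea with per-column null counts, min/max checks, and row-level hash comparisons, and record each failure as a tracked defect.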

Posted 4 days ago

Apply

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

About ACA:
ACA was founded in 2002 by four former SEC regulators and one former state regulator. The founders recognized the necessity for investment advisers to receive expert guidance on existing and new regulations. Over time, ACA has grown organically and through acquisitions to enhance our GRC business and technology solutions. Our services now include GIPS standards verification, cybersecurity and technology risk, regulatory technology, ESG advisory, AML and financial crimes, financial and regulatory reporting, and Mirabella for establishing EU operations.

Position Summary:
As an Enterprise Data Manager at ACA, you will oversee the development and use of data systems, finding effective ways to organize, store, and analyze data while prioritizing security and confidentiality. Your strategic planning and supervision will be instrumental in enhancing our operational efficiency and driving business growth in a data-driven environment.

Job Duties:
- Develop and execute data management strategies aligned with company objectives.
- Supervise the collection, storage, management, quality, and security of data.
- Ensure data accuracy and accessibility, and minimize redundancy.
- Collaborate with IT teams and management to formulate a data strategy that meets industry requirements.
- Lead and mentor a team of data professionals.
- Contribute to the design and development of the core Enterprise Data Services hub.
- Engage directly with internal business stakeholders to elicit requirements and align on solution options.

Education, Experience, and Skills:
- Bachelor's degree in Computer Science, Data Science, or a related field with a minimum of seven (7) years of experience, OR a minimum of ten (10) years of relevant industry experience in the absence of a bachelor's degree.
- Proficiency in data modeling practices with a background in building enterprise data solutions.
- Experience designing and building reporting and dashboarding solutions.
- Familiarity with the Microsoft Azure cloud environment and relevant data technologies such as Azure Data Factory, Event Hubs, and Synapse.
- Eagerness to learn and experiment with new technologies to deliver top-notch solutions.
- Strong communication skills to articulate technical topics to business stakeholders.
- Experience extracting and pushing data from diverse APIs, including REST.
- Proficiency with Postman.
- Ability to read and write SQL, stored procedures, and functions.
- Demonstrated knowledge of and experience with Azure Data Factory pipelines.
- Proficiency in the Azure data landscape.
- Familiarity with DAX and M query formations.

Preferred Education and Experience:
- Hands-on experience with Power BI.
- Experience building SSAS cubes, particularly tabular models.
- Knowledge of C#, Python, or other object-oriented programming languages is a plus, currently used for writing Azure Functions as required.
- Understanding of AI/ML to support long-term strategy.

Required Skills and Attributes:
- Proficient in data modeling.
- Hands-on experience with Azure Cloud and pertinent data technologies.
- Adept at building reports and dashboards, preferably using Power BI.
- Ability to collaborate effectively with all levels of leadership and business partners.
- Capable of managing multiple projects simultaneously and adapting swiftly to changing business needs.

Why join our team?
Joining ACA means becoming part of the premier governance, risk, and compliance (GRC) advisor in financial services. Our team comprises former regulators, compliance professionals, legal experts, GIPS standards verifiers, cybersecurity specialists, ESG advisors, and regulatory technology practitioners. We foster an entrepreneurial work environment by providing innovative solutions tailored to our clients' needs. At ACA, we value creative thinking, offer diverse career paths, and emphasize continuous learning through inquiry, curiosity, and transparency. If you are ready to be part of an award-winning, global team of dedicated and skilled professionals, ACA is the place for you.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.

At ZS, we honor the visible and invisible elements of our identities, personal experiences, and belief systems - the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

**What you'll do:**

We are looking for experienced Knowledge Graph developers with the following technical skillset and experience. You will take complete ownership of activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements, applying appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
You will collaborate with other team members to leverage expertise and ensure seamless transitions, exhibit flexibility in taking on new and challenging problems, and demonstrate excellent task management. You will assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management; bring transparency in driving assigned tasks to completion and report accurate status; bring a consulting mindset to problem-solving and innovation by leveraging technical and business knowledge and collaborating across other teams; assist senior team members and delivery leads with project management responsibilities; and build complex solutions using programming languages, ETL service platforms, etc.

**What you'll bring:**

- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 4+ years of professional experience in knowledge graph development on Neo4j, AWS Neptune, or the Anzo knowledge graph database.
- 3+ years of experience with RDF ontologies, data modeling, and ontology development.
- Strong expertise in Python, PySpark, and SQL.
- Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
- Project management and task planning experience, ensuring smooth execution of deliverables and timelines.
- Strong communication and interpersonal skills to collaborate with both technical and non-technical teams.
- Experience with automation testing.
- Performance optimization: knowledge of techniques to optimize knowledge graph operations such as data inserts.
- Data modeling: proficiency in designing effective data models within a knowledge graph, including relationships between tables and optimizing data for reporting.
- Motivation and willingness to learn new tools and technologies as per the team's requirements.
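The RDF ontology work described above models data as subject-predicate-object triples. A minimal in-memory sketch of triple storage and pattern matching follows; in practice this would be Cypher against Neo4j or SPARQL against Neptune or Anzo, and the entity names here are purely hypothetical:

```python
# A tiny triple store: each fact is a (subject, predicate, object) tuple.
triples = {
    ("aspirin", "isA", "Drug"),
    ("aspirin", "treats", "headache"),
    ("ibuprofen", "isA", "Drug"),
    ("ibuprofen", "treats", "inflammation"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard,
    playing the role of a variable in a SPARQL or Cypher query."""
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

# "Which drugs are in the graph?" -
# analogous to SELECT ?x WHERE { ?x isA Drug } in SPARQL.
drugs = query(p="isA", o="Drug")
print(drugs)
```

Graph databases make this pattern-matching fast at scale with indexes over the triple positions, which is where the performance-optimization experience the posting asks for comes in.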
**Additional Skills:**

- Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations.
- Experience with pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.
- Experience with manufacturing data is a plus.
- Ability to simplify complex concepts into easily understandable frameworks and presentations.
- Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects.
- Willingness to travel to other offices as required to collaborate with clients and internal project teams.

**Perks & Benefits:**

ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member.

We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

**Travel:**

Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

**Considering applying?**

At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all.
We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

**To Complete Your Application:**

Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find Out More At: www.zs.com

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Migration and Data Management Specialist, your responsibilities will include:

Migration Management: Lead the migration from Informatica MDM to Ataccama MDM, developing comprehensive strategies, plans, and timelines to ensure a smooth transition. Ensure data accuracy, consistency, and completeness throughout the migration process.

Data Quality and Integration: Manage ETL processes for extracting, transforming, and loading data into Ataccama MDM. Implement and uphold data quality rules and processes within Ataccama to maintain high data quality standards. Oversee API integrations with Ataccama to facilitate seamless data flow across systems.

Collaboration and Coordination: Work closely with cross-functional teams to gather requirements and ensure alignment with business objectives. Provide training and support to team members on Ataccama MDM functionality. Collaborate with IT and business units to troubleshoot and resolve any migration-related issues promptly.

Documentation and Reporting: Thoroughly document migration processes, procedures, and best practices for future reference. Generate detailed reports on migration progress and data quality metrics to track performance. Offer recommendations for continuous improvement in data management practices.

Certifications: A Bachelor's degree in Information Management, Computer Science, Data Science, or a related field is required. Ataccama MDM certification is preferred.

Primary Skills: Demonstrated experience in successfully migrating from Informatica MDM to Ataccama MDM. Hands-on expertise in ETL processes, data quality management, and MDM operations. Proficiency in working with Ataccama MDM and related tools.

Secondary Skills: Strong analytical and problem-solving abilities to address complex data management challenges. Meticulous attention to detail and accuracy in handling data processes. Proficiency in data modeling and database management techniques. Excellent communication and interpersonal skills for effective collaboration with team members.

Additional Requirements: Candidates holding Australian visas are preferred. Familiarity with industry standards and regulations pertaining to data management. Proficiency in SQL and data querying languages. Openness to learning and adopting new technologies, specifically Ataccama MDM.

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

As a Kinaxis Rapid Response Consultant, you will play a crucial role in implementing end-to-end supply chain planning solutions on the Kinaxis Rapid Response platform. Your responsibilities will include guiding solution configurations, troubleshooting technical and functional issues during implementation, and incorporating advanced technologies such as analytics, AI, and machine learning to enhance supply chain operations and address complex challenges such as inventory optimization and S&OP simplification.

You should have a minimum of 8 years of experience with supply chain planning solutions, including at least 4 years of specific experience with Kinaxis Rapid Response. Your expertise should encompass areas such as demand planning, supply and production planning, inventory optimization, constrained planning, and S&OP. Additionally, you should possess a deep understanding of Kinaxis Rapid Response data management, modeling, key control settings, and analytics, along with the ability to integrate legacy data and manage spreadsheets effectively.

In this role, you will be responsible for maintaining Kinaxis solutions and environments and configuring Workbooks, Worksheets, Reports, Forms, and Scripts in Rapid Response based on the solution design requirements. Knowledge of Kinaxis Rapid Response integration with SAP and domain expertise in retail, wholesale retail, or manufacturing will be advantageous. Your business process knowledge and interpersonal skills will be essential for successful implementation of Kinaxis for new and existing businesses. Prior experience with planning applications such as SAP IBP, o9, or Blue Yonder will be considered a plus.

A degree in Computer Science, Management Information Systems, Management Accounting, Business Administration, or a related field is required. Certifications such as Kinaxis Rapid Response Author Level 3 or Solution Consultant Level 3 are preferred for this role.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

NTT DATA is looking for a Data & Analytics Principal Consultant to join their team in Bangalore, Karnataka, India. As part of this inclusive and forward-thinking organization, you will be responsible for hands-on development of GenAI solutions on the Microsoft Azure platform. You will lead the build team; maintain, deploy, and support the internal GenAI platform; and integrate with LLMs, LangChain, and OpenAI. Your duties will include designing and building advanced GenAI plugins and services to streamline workflows and tackle complex challenges. You will also conduct code and PR reviews, collaborate with DevOps and Cloud Engineering teams to enhance development processes, and ensure requirements are met in the platform build-out. Additionally, you will collaborate on the design and implementation of best practices for vector database architecture, stay updated on generative AI and LLMs, and produce detailed technical documentation for platform capabilities. The ideal candidate should have expertise in Microsoft GenAI (Copilot) and Azure AI/ML, and a strong understanding of GenAI solutions development.

NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. As a Global Top Employer, NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem. Their services include business and technology consulting, data and artificial intelligence solutions, and the development and management of applications, infrastructure, and connectivity. Join NTT DATA to drive innovation and transformation for long-term success.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

punjab

On-site

Copeland is a global climate technologies company engineered for sustainability, creating sustainable and efficient residential, commercial, and industrial spaces through HVACR technologies. Its focus includes protecting temperature-sensitive goods throughout the cold chain and providing comfort worldwide. By combining best-in-class engineering, design, and manufacturing with leading brands in compression, controls, software, and monitoring solutions, Copeland develops next-generation climate technology tailored to future needs. Whether you are a professional seeking a career change, an undergraduate student exploring opportunities, or a recent graduate with an advanced degree, numerous opportunities await you to innovate, be challenged, and make a significant impact by joining the team today.

In Software Development, you will develop code and solutions that transfer and transform data across systems. Maintaining deep technical knowledge of the tools in the data warehouse, data hub, and analytical stack is crucial, as is ensuring that data is transformed and stored efficiently for retrieval and use, and optimizing data systems' performance. You will also develop a thorough understanding of the business systems that underlie the analytical systems. You will adhere to the standard software development lifecycle, code control, code standards, and process standards, and continually expand your technical knowledge through self-training, educational opportunities, and participation in professional organizations relevant to your skills.

In Systems Analysis, you will collaborate with key stakeholders to understand business needs and capture functional and technical requirements, propose ideas to simplify solution designs, and communicate expectations to stakeholders and resources during solution delivery. Developing and executing test plans to ensure the successful rollout of solutions, including data accuracy and quality, is also part of your responsibilities.

In Service Management, effective communication with leaders and stakeholders to address obstacles during solution delivery is imperative. You will define and manage promised delivery dates; proactively research, analyze, and predict operational issues; and offer viable options to resolve unexpected challenges during solution development and delivery.

Education and job-related technical skills include a Bachelor's degree in Computer Science, Information Technology, or equivalent, and a minimum of three years of experience in a Data Engineer role with expertise in the relevant tools and technologies. You can communicate effectively with individuals at all levels, verbally and in writing, in a courteous, tactful, and professional manner. Experience working in a large, global corporate structure, an advanced level of English (additional language proficiency is advantageous), a strong sense of ethics and adherence to the company's core values, and willingness to travel domestically and internationally to support global implementations are required. You can clearly identify and define problems, assess alternative solutions, and make timely decisions, operating efficiently even in ambiguous situations, with strong analytical skills to evaluate approaches against objectives. Desired behaviors and soft skills include writing about technical concepts clearly, leading problem-solving teams, resolving conflicts efficiently, collaborating on cross-functional projects, and driving process-mapping sessions. The Korn Ferry competencies for this role include customer focus, building networks, instilling trust, being tech-savvy, interpersonal savvy, self-awareness, action orientation, collaboration, and nimble learning.

Copeland's commitment to its people is evident in its dedication to sustainability, reducing carbon emissions, and improving energy efficiency through groundbreaking innovations in HVACR technology and cold chain solutions. A culture of passion, openness, and collaboration empowers employees to work toward the common goal of making the world a better place. Copeland invests in the comprehensive development of individuals, from onboarding through senior leadership, and offers flexible, competitive benefits plans that cater to individual and family needs, with various options for time off, including paid parental leave, vacation, and holiday leave.

The commitment to Diversity, Equity & Inclusion at Copeland emphasizes that a diverse, equitable, and inclusive environment is essential to organizational success. The company fosters a culture where every employee is welcomed, heard, respected, and valued for their experiences, ideas, perspectives, and expertise. Embracing diversity and inclusion drives innovation, enhances customer service, and creates a positive impact in the communities where the company operates. Copeland is an Equal Opportunity Employer, fostering an inclusive workplace where all individuals are valued and respected for their contributions and unique qualities.

Posted 4 days ago

Apply