
5858 Data Warehousing Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Bridgnext, you will be responsible for working on internal and customer-facing projects. Your primary focus will be on ensuring code quality and providing optimal solutions that meet client requirements while anticipating their future needs based on market understanding. Your experience with Hadoop projects, including data processing and representation using various AWS services, will be valuable in this role.

You should have at least 4 years of experience in data engineering, with a specialization in big data technologies such as Spark and Kafka, and a minimum of 2 years of hands-on experience with Databricks. A strong understanding of data architecture, ETL processes, and data warehousing is necessary, along with proficiency in programming languages such as Python or Java. Experience with cloud platforms such as AWS, Azure, and GCP, as well as familiarity with big data tools, will be beneficial.

Excellent communication, interpersonal, and leadership skills are required to collaborate effectively with team members and clients, and you should be able to work in a fast-paced environment while managing multiple priorities efficiently. Beyond technical skills, you should have solid written, verbal, and presentation abilities and be a strong team player who can also work independently. You are expected to maintain composure in varied situations, collaborate well, uphold high standards of professionalism, and consistently deliver high-quality results; self-sufficiency and openness to creative solutions will be key in addressing any challenges that arise in the role.
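For context, a minimal sketch of the Spark-plus-Kafka ingestion work described above: consuming JSON events from a Kafka topic with Spark Structured Streaming and landing them as Parquet. The broker, topic, schema, and paths are placeholders rather than details from the posting, and the job assumes the spark-sql-kafka package is available.

```python
# Hypothetical sketch: stream JSON events from Kafka into Parquet with PySpark.
# Broker, topic, schema, and paths are invented placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "orders")                     # placeholder topic
       .load())

# Kafka delivers bytes; cast to string and parse the JSON payload.
events = (raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/landing/orders")              # placeholder path
         .option("checkpointLocation", "/data/checkpoints/orders")
         .start())
query.awaitTermination()
```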

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Business Intelligence Specialist at Adobe, you will have the opportunity to work closely with business analysts to understand design specifications and translate requirements into technical models, dashboards, reports, and applications. Your role will involve collaborating with business users to cater to their ad-hoc requests and deliver scalable solutions on MSBI platforms. You will be responsible for system integration of data sources, creating technical documents, and ensuring data and code quality through standard methodologies and processes.

To succeed in this role, you should have at least 3 years of experience in SSIS, SSAS, Data Warehousing, Data Analysis, and Business Intelligence. You should also possess advanced proficiency in data warehousing tools and technologies, including databases, SSIS, and SSAS, along with an in-depth understanding of data warehousing principles and dimensional modeling techniques. Hands-on experience in ETL processes, database optimization, and query tuning is essential. Familiarity with cloud platforms such as Azure and AWS, as well as Python or PySpark and Databricks, would be beneficial. Experience in creating interactive dashboards using Power BI is an added advantage. In addition to technical skills, strong problem-solving and analytical abilities, quick learning capabilities, and excellent communication and presentation skills are important for this role. A Bachelor's degree in Computer Science, Information Technology, or an equivalent technical discipline is required.

At Adobe, we value a free and open marketplace for all employees and provide internal growth opportunities for your career development. We encourage creativity, curiosity, and continuous learning as part of your career journey. To prepare for internal opportunities, update your Resume/CV and Workday profile, explore the Internal Mobility page on Inside Adobe, and check out tips to help you prep for interviews. The Talent Team will reach out to you within 2 weeks of applying for a role via Workday, and if you move forward in the interview process, inform your manager for support in your career growth. Join Adobe to work in an exceptional environment with colleagues committed to helping each other grow through ongoing feedback. If you are looking to make an impact and grow your career, Adobe is the place for you. Discover more about employee experiences on the Adobe Life blog and explore the meaningful benefits we offer. For any accommodation needs during the application process, please contact accommodations@adobe.com.

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Chandigarh

On-site

As a Solution Architect, your primary responsibility will be to design and implement scalable data integration solutions using Oracle Data Integrator (ODI). You will utilize Python for advanced data transformation, automation, and orchestration tasks. It will be crucial for you to translate business requirements into comprehensive end-to-end data solutions, prioritizing performance, maintainability, and regulatory compliance. Collaboration with stakeholders from various teams such as data engineering, analytics, compliance, and business will be essential to define architecture standards and ensure alignment.

In this role, you will lead technical design sessions, develop architecture documents, and provide mentorship to development teams on industry best practices. Ensuring that data governance, privacy, and security standards are integrated into the architecture will be a key focus. You will also drive the migration and modernization of legacy healthcare systems onto contemporary data platforms, whether on-premise or in the cloud. Troubleshooting and optimizing complex data pipelines and integration workflows will also be part of your responsibilities.

To excel in this position, you should possess at least 8 years of experience in data architecture, data engineering, or related technical roles. A strong command of Oracle Data Integrator (ODI), particularly for enterprise-scale ETL/ELT workflows, is essential. Proficiency in Python for scripting, data wrangling, and automation is required. Additionally, you must have a solid understanding of data modeling, data warehousing, and healthcare data standards such as HL7, FHIR, ICD, and CPT. Familiarity with HIPAA compliance and healthcare data privacy/security practices is expected. Experience in designing and implementing cloud-based data architectures, including platforms like OCI, AWS, and Azure, will be advantageous. Strong expertise in SQL and database optimization, with experience in Oracle, PostgreSQL, or similar databases, will also be beneficial for this role.
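For context, a small illustration of the Python transformation work such a role involves, in the spirit of the de-identification practices that HIPAA compliance typically requires: normalize diagnosis codes and pseudonymize patient identifiers before load. The column names and salt are invented for the example; this is a sketch, not the employer's actual pipeline.

```python
# Hypothetical wrangling step: canonicalize ICD-10 codes and replace a direct
# patient identifier with a salted one-way hash. Columns and salt are invented.
import hashlib

import pandas as pd

SALT = "replace-with-managed-secret"  # in practice, pull from a secret store


def pseudonymize(value: str) -> str:
    """One-way hash keeps records joinable without exposing identity."""
    return hashlib.sha256((SALT + value).encode()).hexdigest()


claims = pd.DataFrame({
    "patient_id": ["P001", "P002"],
    "icd10": [" e11.9", "I10 "],
})

claims["patient_key"] = claims["patient_id"].map(pseudonymize)
claims["icd10"] = claims["icd10"].str.strip().str.upper()  # canonical ICD-10 form
claims = claims.drop(columns=["patient_id"])               # drop the direct identifier
print(claims)
```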

Posted 1 day ago

Apply

15.0 - 19.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are looking for a highly skilled and experienced Data Architect to join our team. With at least 15 years of experience in data engineering and analytics, the ideal candidate will have a proven track record of designing and implementing complex data solutions. As a senior principal data architect, you will play a key role in designing, creating, deploying, and managing Blackbaud's data architecture. This position holds significant technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. You will act as an evangelist for proper data strategy across various teams within Blackbaud, and provide technical guidance, particularly in the area of data, for other projects.

Responsibilities:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction for the AI Center of Excellence and beyond collaboratively.
- Design and develop innovative products, services, or technological advancements in the Data Intelligence space to drive business expansion.
- Collaborate with product management to create technical solutions that address customer business challenges.
- Take ownership of technical data governance practices to ensure data sovereignty, privacy, security, and regulatory compliance.
- Challenge existing practices and drive innovation in the data space.
- Create a data access strategy to securely democratize data and support research, modeling, machine learning, and artificial intelligence initiatives.
- Contribute to defining tools and pipeline patterns used by engineers and data engineers for data transformation and analytics support.
- Work within a cross-functional team to translate business requirements into data architecture solutions.
- Ensure that data solutions prioritize performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Stay updated on technology trends such as distributed computing, big data concepts, and architecture.
- Advocate internally for the transformative power of data at Blackbaud.

Required Qualifications:
- 15+ years of experience in data and advanced analytics.
- Minimum of 8 years of experience with data technologies in Azure/AWS.
- Proficiency in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Familiarity with Databricks and Microsoft Fabric.
- Strong grasp of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership abilities.

Preferred Qualifications:
- Experience with .Net/Java and Microservice Architecture.

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

Be a part of a dynamic team and excel in an environment that values diversity and creativity. Continue to sharpen your skills and ambition while pushing the industry forward. As a Data Architect at JPMorgan Chase within the Employee Platforms, you serve as a seasoned member of a team to develop high-quality data architecture solutions for various software applications and platforms. By incorporating leading best practices and collaborating with teams of architects, you are an integral part of carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

In this role, you will be responsible for designing and implementing data models that support our organization's data strategy. You will work closely with Data Product Managers, Engineering teams, and Data Governance teams to ensure the delivery of high-quality data products that meet business needs and adhere to best practices.

Job responsibilities include:
- Executing data architecture solutions and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions and break down problems.
- Collaborating with Data Product Managers to understand business requirements and translate them into data modeling specifications.
- Conducting interviews and workshops with stakeholders to gather detailed data requirements.
- Creating and maintaining data dictionaries, entity-relationship diagrams, and other documentation to support data models.
- Producing secure and high-quality production code and maintaining algorithms that run synchronously with appropriate systems.
- Evaluating data architecture designs and providing feedback on recommendations.
- Representing the team in architectural governance bodies.
- Leading the data architecture team in evaluating new technologies to modernize the architecture using existing data standards and frameworks.
- Gathering, analyzing, synthesizing, and developing visualizations and reporting from large, diverse data sets in service of continuous improvement of data frameworks, applications, and systems.
- Proactively identifying hidden problems and patterns in data and using these insights to drive improvements to coding hygiene and system architecture.
- Contributing to data architecture communities of practice and events that explore new and emerging technologies.

Required qualifications, capabilities, and skills:
- Formal training or certification in Data Architecture and 3+ years of applied experience.
- Hands-on experience in data platforms, cloud services (e.g., AWS, Azure, or Google Cloud), and big data technologies.
- Strong understanding of database management systems, data warehousing, and ETL processes.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Knowledge of data governance principles and best practices.
- Ability to evaluate current technologies to recommend ways to optimize data architecture.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming and database querying languages.
- Overall knowledge of the Software Development Life Cycle.
- Solid understanding of agile methodologies such as continuous integration and delivery, application resiliency, and security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile).

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with big data technologies (e.g., Hadoop, Spark).
- Certification in data modeling or data architecture.

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

We are seeking a skilled Data Analyst with exceptional communication abilities and in-depth proficiency in SQL, Tableau, and contemporary data warehousing technologies. As a Data Analyst, you will design data models, create insightful dashboards, ensure data quality, and extract valuable insights from extensive datasets to support strategic business decisions.

Your primary responsibilities will include writing advanced SQL queries to extract and manipulate data from cloud data warehouses such as Snowflake, Redshift, or BigQuery. You will design and implement data models that serve analytical and reporting requirements, and develop dynamic, interactive dashboards and reports using tools such as Tableau, Looker, or Domo. You will also apply advanced analytics techniques such as cohort analysis, time series analysis, scenario analysis, and predictive analytics. Ensuring data accuracy through thorough quality assurance checks, investigating data issues, and collaborating with BI or data engineering teams on root cause analysis will also be part of your role, as will communicating analytical insights effectively to stakeholders.

The ideal candidate has excellent communication skills, at least 5 years of experience in data analytics, BI analytics, or BI engineering roles, and expert-level proficiency in SQL. Proficiency in data visualization tools such as Tableau, Looker, or Domo is essential, along with a strong grasp of data modeling principles and best practices. Hands-on experience with cloud data warehouses such as Snowflake, Redshift, or BigQuery, or with SQL Server or Oracle, is required, as is intermediate proficiency in tools such as Excel, Google Sheets, or Power BI, including functions, pivots, and lookups. A Bachelor's or advanced degree in a relevant field such as Data Science, Computer Science, Statistics, Mathematics, or Information Systems is preferred. The ability to collaborate with cross-functional teams, including BI engineers, to enhance reporting solutions is vital; experience managing large-scale enterprise data environments is advantageous, and familiarity with data governance, data cataloging, and metadata management tools is a plus.

Job Type: Full-time
Benefits:
- Health insurance
- Paid time off
- Provident Fund
Schedule: Monday to Friday
Education: Bachelor's (Required)
Experience:
- Data analytics: 5 years (Required)
- Tableau: 2 years (Required)
Work Location: In person
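For context, a compact pandas sketch of the cohort analysis listed among the posting's techniques: bucket users by first-order month, then count how many are active in each later month. The input columns (user_id, order_date) are assumed for illustration.

```python
# Cohort retention sketch: rows are signup cohorts, columns are months since
# the first order, values are distinct active users. Sample data is invented.
import pandas as pd

orders = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3],
    "order_date": pd.to_datetime(
        ["2024-01-05", "2024-02-10", "2024-01-20", "2024-03-02", "2024-02-14"]),
})

orders["order_month"] = orders["order_date"].dt.to_period("M")
orders["cohort"] = (orders.groupby("user_id")["order_date"]
                    .transform("min").dt.to_period("M"))
# Period subtraction yields month offsets; .n extracts the integer gap.
orders["months_since"] = (orders["order_month"] - orders["cohort"]).apply(lambda d: d.n)

retention = (orders.groupby(["cohort", "months_since"])["user_id"]
             .nunique()
             .unstack(fill_value=0))
print(retention)
```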

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Data Modeler specializing in Hybrid Data Environments, you will play a crucial role in designing, developing, and optimizing data models that facilitate enterprise-level analytics, insights generation, and operational reporting. You will collaborate with business analysts and stakeholders to comprehend business processes and translate them into effective data modeling solutions. Your expertise in traditional data stores such as SQL Server and Oracle DB, along with proficiency in Azure/Databricks cloud environments, will be essential in migrating and optimizing existing data models.

Your responsibilities will include designing logical and physical data models that capture the granularity of data required for analytical and reporting purposes. You will establish data modeling standards and best practices to maintain data architecture integrity and collaborate with data engineers and BI developers to ensure data models align with analytical and operational reporting needs. Conducting data profiling and analysis to understand data sources, relationships, and quality will inform your data modeling process.

Your qualifications should include a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with a minimum of 5 years of experience in data modeling. Proficiency in SQL, familiarity with data modeling tools, and understanding of Azure cloud services, Databricks, and big data technologies are essential. Your ability to translate complex business requirements into effective data models, strong analytical skills, attention to detail, and excellent communication and collaboration abilities will be crucial in this role.

In summary, as a Data Modeler for Hybrid Data Environments, you will drive the development and maintenance of data models that support analytical and reporting functions, contribute to the establishment of data governance policies and procedures, and continuously refine data models to meet evolving business needs and leverage new data modeling techniques and cloud capabilities.

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Haryana

On-site

As a Power BI Developer, you will be responsible for developing and maintaining scalable data pipelines using Python and PySpark. You will collaborate with data engineers and data scientists to fulfill data processing needs and optimize existing PySpark applications for performance. Writing clean, efficient, and well-documented code following best practices is a crucial part of your role. Additionally, you will participate in design and code reviews, develop and implement ETL processes, and ensure data integrity and quality throughout the data lifecycle, while staying current with the latest industry trends in big data and cloud computing.

The ideal candidate has a minimum of 6 years of experience designing and developing advanced Power BI reports and dashboards, with working experience in data modeling and DAX calculations, developing and maintaining data models, creating reports and dashboards, analyzing and visualizing data, ensuring data governance and compliance, and troubleshooting and optimizing Power BI solutions.

Preferred skills for this role include strong proficiency in Power BI Desktop, DAX, Power Query, and data modeling. Experience in analyzing data, creating visualizations, building interactive dashboards, connecting to various data sources, and transforming data is highly valued. Excellent communication and collaboration skills are necessary to work effectively with stakeholders, and familiarity with SQL, data warehousing concepts, and UI/UX development would be beneficial.
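For context, a sketch of two routine PySpark optimizations of the kind this role calls for: broadcasting a small dimension table to avoid a shuffle join, and caching an intermediate DataFrame reused by several aggregates. Paths and column names are placeholders.

```python
# Hypothetical tuning pass on a fact/dimension join; paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("pyspark-tuning").getOrCreate()

facts = spark.read.parquet("/data/fact_sales")    # large table
dims = spark.read.parquet("/data/dim_product")    # small lookup table

# Broadcast the small side so each executor joins locally instead of shuffling.
enriched = facts.join(broadcast(dims), on="product_id")

# Cache once when the same intermediate feeds multiple downstream aggregates.
enriched.cache()
by_region = enriched.groupBy("region").sum("amount")
by_month = enriched.groupBy("month").sum("amount")
by_region.show()
by_month.show()
```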

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Pricing Revenue Growth Consultant, your primary role will be to advise on building a pricing and promotion tool for a Consumer Packaged Goods (CPG) client. This tool will encompass pricing strategies, trade promotions, and revenue growth initiatives. You will be responsible for developing analytics and machine learning models to analyze price elasticity, promotion effectiveness, and trade promotion optimization. Collaboration with the CPG business, marketing, data scientists, and other teams will be essential for the successful delivery of the project and tool.

Your business domain skills will be crucial in this role, including expertise in Trade Promotion Management (TPM), Trade Promotion Optimization (TPO), Promotion Depth Frequency Forecasting, Price Pack Architecture, Competitive Price Tracking, Revenue Growth Management, and Financial Modeling. Additionally, you will need proficiency in AI and Machine Learning for Pricing, and in dynamic pricing implementation.

Key Responsibilities:
- Utilize consulting skills for hypothesis-driven problem solving, go-to-market pricing, and revenue growth execution.
- Conduct advisory presentations and data storytelling.
- Provide project leadership and execution.

Technical Requirements:
- Proficiency in programming languages such as Python and R for data manipulation and analysis.
- Expertise in machine learning algorithms and statistical modeling techniques.
- Familiarity with data warehousing, data pipelines, and data visualization tools like Tableau or Power BI.
- Experience with cloud platforms like ADF, Databricks, and Azure, and their AI services.

Additional Responsibilities:
- Working collaboratively with cross-functional teams across sales, marketing, and product development.
- Managing stakeholders and leading teams.
- Thriving in a fast-paced environment focused on delivering timely insights to support business decisions.
- Demonstrating excellent problem-solving skills and the ability to address complex technical challenges.
- Communicating effectively with cross-functional teams and stakeholders.
- Managing multiple projects simultaneously and prioritizing tasks based on business impact.

Qualifications:
- A degree in Data Science or Computer Science with a specialization in data science.
- A Master's in Business Administration and Analytics is preferred.

Preferred Skills:
- Experience in Technology, Big Data, and Text Analytics.
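For context, a minimal sketch of the standard log-log approach behind the price-elasticity modeling described above: regress log units on log price, and the slope is the elasticity estimate. The data points are invented.

```python
# Log-log price elasticity: slope of log(units) on log(price). Invented data.
import numpy as np

price = np.array([2.0, 2.2, 2.5, 2.8, 3.0, 3.3])
units = np.array([980, 900, 810, 700, 650, 560])

slope, intercept = np.polyfit(np.log(price), np.log(units), 1)
print(f"estimated elasticity: {slope:.2f}")
# ~ -1.1 here: a 1% price increase is associated with roughly a 1.1% volume drop.
```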

Posted 1 day ago

Apply

2.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Principal Data Engineer at Brillio, you will play a crucial role in leveraging your expertise in data modeling, particularly with tools like ER Studio and ERwin. Brillio specializes in turning disruptive technologies into a competitive advantage for Fortune 1000 companies through innovative digital adoption, and prides itself on being a rapidly growing digital technology service provider that integrates cutting-edge digital skills with a client-centric approach. As a preferred employer, Brillio attracts top talent by offering opportunities to work on exclusive digital projects and engage with groundbreaking technologies.

To excel in this role, you should have over 10 years of IT experience, with a minimum of 2 years dedicated to Snowflake and hands-on modeling experience. Your proficiency in Data Lake/ODS, ETL concepts, and data warehousing principles will be critical in delivering effective solutions to clients, and collaborating with clients on both physical and logical model solutions will be a key aspect of your responsibilities.

Your technical skills should encompass advanced data modeling concepts, experience modeling large volumes of data, and proficiency in tools like ER Studio. A solid grasp of database concepts, including data warehouses, reporting, and analytics, is essential, and familiarity with platforms like SQLDBM and expertise in entity-relationship modeling will further strengthen your profile. Excellent communication skills will enable you to lead teams effectively and collaborate seamlessly with clients. Exposure to the AWS ecosystem is a plus, and your ability to design and administer databases, develop SQL queries for analysis, and implement data modeling for star schema designs will be instrumental in your success as a Principal Data Engineer at Brillio.

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Senior Data Engineering Architect at Iris Software, you will play a crucial role in leading enterprise-level data engineering projects on public cloud platforms like AWS, Azure, or GCP. Your responsibilities will include engaging with client managers to understand their business needs, conceptualizing solution options, and finalizing strategies with stakeholders. You will also be involved in team building, delivering Proofs of Concept (PoCs), and enhancing competencies within the organization.

Your role will focus on building competencies in Data & Analytics, including Data Engineering, Analytics, Data Science, AI/ML, and Data Governance. Staying updated with the latest tools, best practices, and trends in the Data and Analytics field will be essential to drive innovation and excellence in your work.

To excel in this position, you should hold a Bachelor's or Master's degree in a software discipline and have extensive experience in data architecture and in implementing large-scale Data Lake/Data Warehousing solutions. Your background in Data Engineering should demonstrate leadership in solutioning, architecture, and successful project delivery. Strong communication skills in English, both written and verbal, are essential for effective collaboration with clients and team members. Proficiency in tools such as AWS Glue, Redshift, Azure Data Lake, Databricks, and Snowflake, along with database fundamentals and programming skills in Spark, Spark SQL, PySpark, and Python, are mandatory competencies for this role.

Joining Iris Software offers a range of perks and benefits designed to support your financial, health, and overall well-being. From comprehensive health insurance and competitive salaries to flexible work arrangements and continuous learning opportunities, we are dedicated to providing a supportive and rewarding work environment where your success and happiness are valued. If you are inspired to grow your career in Data Engineering and thrive in a culture that values talent and personal growth, Iris Software is the place for you. Be part of a dynamic team where you can be valued, inspired, and encouraged to be your best professional and personal self.

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As an ETL Developer, you will play a key role in supporting the design, development, and maintenance of enterprise data integration solutions. Your main responsibilities will include designing, developing, and implementing ETL workflows using SSIS and/or Informatica PowerCenter. You will be expected to extract, transform, and load data from various sources such as SQL Server, Oracle, flat files, APIs, Excel, and cloud platforms. Furthermore, you will need to optimize existing ETL processes for improved performance, reliability, and scalability. Unit testing, integration testing, and data validation will be crucial to ensure data quality and consistency, and maintaining technical documentation for ETL processes, mappings, and workflows is also an essential part of your role.

Collaboration with data architects, BI analysts, and business stakeholders will be necessary to understand data requirements and deliver clean, structured data solutions. Monitoring daily data loads, resolving ETL failures promptly, and ensuring data security, integrity, and compliance are additional responsibilities, and your involvement in code reviews, peer testing, and production deployment activities will be vital to the success of projects.

Your technical skills should include strong hands-on experience in SSIS and/or Informatica PowerCenter development, proficient SQL programming abilities, and familiarity with ETL performance tuning and error handling. Knowledge of data modeling concepts, data warehousing principles, and slowly changing dimensions (SCDs) is essential. Exposure to source control systems, job schedulers, and cloud-based data platforms, along with an understanding of data governance and compliance standards, will be advantageous.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, along with at least 3-5 years of relevant experience in ETL development using SSIS and/or Informatica. Strong problem-solving skills, analytical thinking, excellent communication abilities, and the capacity to work both independently and in a team-oriented environment are required. Certifications such as Microsoft Certified: Azure Data Engineer Associate, Informatica PowerCenter Developer Certification, or other SQL/BI/ETL-related certifications are beneficial but not required.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Teradata ETL Developer, you will be responsible for designing, developing, and implementing ETL processes using Teradata tools like BTEQ and TPT Utility. Your role will involve optimizing and enhancing existing ETL workflows to improve performance and reliability. Collaboration with cross-functional teams to gather data requirements and translate them into technical specifications will be a key aspect of your responsibilities. Data profiling, cleansing, and validation will also be part of your duties to ensure data quality and integrity. Monitoring ETL processes, troubleshooting any issues in the data pipeline, and participating in the technical design and architecture of data integration solutions are critical tasks you will perform. Additionally, documenting ETL processes, data mapping, and operational procedures for future reference and compliance will be essential.

To excel in this role, you should possess proven experience as a Teradata ETL Developer with a strong understanding of BTEQ and TPT Utility. A solid grasp of data warehousing concepts, ETL methodologies, and data modeling is required. Proficiency in SQL, including the ability to write complex queries for data extraction and manipulation, is essential. Familiarity with data integration tools and techniques, especially in a Teradata environment, will be beneficial. Strong analytical and problem-solving skills are necessary to diagnose and resolve ETL issues efficiently. You should be able to work collaboratively in a team environment while also demonstrating self-motivation and attention to detail. Excellent communication skills are a must to effectively engage with both technical and non-technical stakeholders.

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Staff Cloud Support Engineer at Snowflake, you will be a crucial part of the Snowflake Support team, dedicated to providing high-quality resolutions that help customers achieve data-driven business insights and results. You will work with a team of subject matter experts to ensure customer success by listening, learning, and building strong connections with customers. Your responsibilities will include working on a variety of technical issues related to operating systems, database technologies, big data, data integration, connectors, and networking. Customers will rely on you for technical guidance and expert advice on the effective and optimal use of Snowflake Data Warehouse. You will also be the voice of the customer, providing valuable product feedback and suggestions for improvement to Snowflake's product and engineering teams.

In addition to providing exceptional service to customers, you will play a key role in building knowledge within the team and contributing to strategic initiatives for organizational and process improvements. Depending on business needs, you may work with Snowflake Priority Support customers, understanding their use cases and helping them achieve the highest levels of continuity and performance from their Snowflake implementation.

To excel in this role, you should have a Bachelor's or Master's degree in Computer Science or an equivalent discipline, along with at least 8 years of experience in a Technical Support environment or a similar customer-facing technical role. You should possess excellent writing and communication skills in English, attention to detail, and the ability to work collaboratively across global teams. As a Staff Cloud Support Engineer, you will drive technical solutions to complex problems, adhere to response and resolution SLAs, and demonstrate strong problem-solving skills. You will utilize the Snowflake environment, connectors, and third-party partners to investigate issues, document solutions, and submit well-documented bugs and feature requests. Additionally, you will proactively identify recommendations for product quality improvement, customer experience enhancement, and team efficiencies.

It is essential for you to have a clear understanding of data warehousing fundamentals and concepts, the ability to debug and troubleshoot complex SQL queries, and strong knowledge of RDBMS, SQL data types, aggregations, and functions. Experience with database migration, ETL, and scripting/coding in any programming language, along with working knowledge of semi-structured data, is also required. Proficiency in interpreting system performance metrics and understanding cloud service providers' ecosystems is beneficial. Experience working with distributed databases, troubleshooting various operating systems, understanding networking fundamentals and cloud computing security concepts, and proficiency in scripting languages such as Python and JavaScript would be a plus.

Snowflake is looking for individuals who share their values, challenge conventional thinking, and drive innovation while contributing to the company's growth and success.

Posted 2 days ago

Apply

7.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Senior Healthcare Business Analyst at CitiusTech, you will be part of an Agile team designing and building healthcare applications, implementing new features, and ensuring adherence to the best coding development standards. Your responsibilities will include delivering technical preliminary design documents, conducting detailed analysis of data systems to solve complex business problems in an agile environment, providing consulting support for IT and business partners, meeting defined deadlines with a high level of quality, creating system test plans and test data, participating in deliverables required by approved development lifecycles, creating various types of documentation, performing testing, and adhering to IT and corporate policies, procedures, and standards.

This role requires 7-8 years of experience and is based in Mumbai, Pune, or Chennai. An Engineering Degree (BE/ME/BTech/MTech/BSc/MSc) and technical certification in multiple technologies are required; relevant industry-recognized project management certifications such as CSPO, PMP, Agile PM, or SAFe are desirable.

Mandatory technical skills include US Healthcare domain knowledge, strong SQL knowledge, experience in data warehouse and data management projects, collaboration with DBAs and DB developers, creating BRDs, FRDs, UML, and flow diagrams, facilitating business requirement elicitation sessions, and identifying potential issues and risks. A good attitude, experience in the Agile model, excellent communication skills, and adherence to departmental policies and procedures are essential. Good-to-have skills include experience as a Development/Data Analyst, Data Warehousing, working with tools like Microsoft Project, Jira, and Confluence, strategic thinking, and knowledge of the vulnerability and security domain.

CitiusTech is committed to combining IT services, consulting, products, accelerators, and frameworks with a client-first mindset and next-gen tech understanding to humanize healthcare and make a positive impact on human lives. The company values Passion, Respect, Openness, Unity, and Depth (PROUD) of knowledge, creating a fun, transparent, non-hierarchical, diverse work culture focused on continuous learning and work-life balance. Rated as a Great Place to Work, CitiusTech offers comprehensive benefits to ensure a long and rewarding career. The EVP "Be You Be Awesome" reflects the company's efforts to create a great workplace supporting employee growth, well-being, and success. By collaborating with global leaders at CitiusTech, you will have the opportunity to shape the future of healthcare and positively impact human lives.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You will be joining Beyond Key, a Microsoft Gold Partner and a Great Place to Work-certified company that prioritizes the happiness of both team members and clients. Established in 2005, Beyond Key is an international IT consulting and software services firm known for delivering cutting-edge services and products to meet the global needs of its clients across the United States, Canada, Europe, Australia, the Middle East, and India. With a team of over 350 skilled software professionals, Beyond Key creates and designs IT solutions tailored to its clients' requirements. For more information, visit https://www.beyondkey.com/about.

As a Snowflake DevOps Engineer within the BI TEC team, your primary responsibility will be to support and enhance a multi-region Snowflake data warehouse infrastructure. This role will involve developing and maintaining robust CI/CD pipelines using tools like GitHub, Git Actions, Python, TeamCity, and SDA. Proficiency in Control-M for batch scheduling and a solid background in data warehousing are crucial for this position. Collaboration with cross-functional technical teams and a proactive delivery approach are essential aspects of this role. While experience in the Broker Dealer domain is advantageous, a proven track record in managing large-scale data warehouse projects will also be highly valued.

Key Responsibilities:
- Develop and maintain CI/CD pipelines for Snowflake.
- Collaborate with different teams to improve deployment and automation processes.
- Manage batch scheduling using Control-M.
- Ensure quality and security compliance, including conducting Veracode scan reviews.
- Contribute to data warehouse design following Kimball methodologies.
- Translate technical concepts into easily understandable language for business purposes.
- Provide support for production reporting and be available for on-call support when necessary.

Required Skills & Experience:
- Minimum 5 years of experience in Snowflake CI/CD.
- Minimum 5 years of Python development experience.
- Proficiency in GitHub, Git Actions, TeamCity, and SDA.
- Strong understanding of Data Warehousing and Kimball methodology.
- Experience with Control-M for batch processing and job scheduling.
- Familiarity with Veracode or similar security scanning tools.
- Experience working in large-scale database development teams.
- Knowledge of Capital Markets or Broker Dealer domain (preferred).
- Oracle PL/SQL experience is a plus.

If you are seeking a role where you can contribute to innovative data solutions and work collaboratively with a dynamic team, this opportunity at Beyond Key may be perfect for you. Explore all our job openings and share this opportunity with someone exceptional.
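For context, a simplified sketch of the kind of deployment step a Snowflake CI/CD pipeline automates: applying versioned SQL migration files in order. Real pipelines usually delegate this to a dedicated migration tool; the connection parameters below are placeholders to be sourced from CI secrets, and the file layout is invented.

```python
# Hypothetical migration runner: apply migrations/V*.sql to Snowflake in order.
# Connection values are placeholders; in CI they would come from secrets.
import pathlib

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="deploy_user",
    password="***",
    warehouse="DEPLOY_WH",
    database="ANALYTICS",
)

try:
    for script in sorted(pathlib.Path("migrations").glob("V*.sql")):
        print(f"applying {script.name}")
        # execute_string runs each statement in a multi-statement file.
        conn.execute_string(script.read_text())
finally:
    conn.close()
```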

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

You will be joining Viraaj HR Solutions Private Limited, a trusted HR partner with over 4 years of experience in delivering seamless services to a diverse clientele across India. Our commitment to high integrity, transparency, and efficiency ensures a smooth and rewarding experience for both clients and candidates. We conduct business in an appropriate, ethical, and transparent manner, adapting to the ever-evolving commercial, regulatory, and compliance landscape.

As a full-time, on-site US Taxation Manager (Partnership Form 1065) based in Bengaluru, you will play a crucial role in managing all aspects of partnership taxation. Your responsibilities will include preparing and reviewing Form 1065, tax planning, compliance, research, and analysis. Collaboration with various teams will be essential to ensure accurate and timely tax filings and to provide necessary tax advisory services.

To excel in this role, you should have experience in Data Engineering and Data Modeling, proficiency in Extract Transform Load (ETL) and Data Warehousing, and strong skills in Data Analytics. A deep understanding of US tax laws and regulations, particularly partnership taxation, is crucial. Your excellent analytical and problem-solving abilities will be key to success, along with the capacity to work both independently and collaboratively within a team environment. A Bachelor's degree in Accounting, Finance, or a related field is required, and a CPA certification would be advantageous. Prior experience in tax planning, compliance, and research will also be beneficial.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Kolkata, West Bengal

On-site

Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a team of over 125,000 professionals across more than 30 countries, we are motivated by curiosity, entrepreneurial agility, and the desire to create lasting value for our clients. Our purpose is the relentless pursuit of a world that works better for people, and we serve and transform leading enterprises, including the Fortune Global 500, leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently looking for a Principal Consultant - Snowflake Sr. Data Engineer (Snowflake + Python/PySpark) to join our team! As a Snowflake Sr. Data Engineer, you will be responsible for providing technical direction and leading a group of developers toward a common goal. You should have experience in the IT industry and be proficient in building productionized data ingestion and processing pipelines in Snowflake. You should also be well-versed in data warehousing concepts and have expertise in Snowflake features and integration with other data processing tools. Experience with Python programming and PySpark for data analysis is essential for this role.

Key Responsibilities:
- Work on requirement gathering, analysis, designing, development, and deployment.
- Write SQL queries against Snowflake and develop scripts for Extract, Load, and Transform data.
- Understand Data Warehouse concepts and Snowflake Architecture.
- Apply hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, tables, Tasks, and Streams.
- Work with Snowflake AWS data services or Azure data services.
- Use the Python programming language and packages like pandas and NumPy.
- Design and develop efficient ETL jobs using Python and PySpark.
- Use Python and PySpark for data cleaning, pre-processing, and transformation tasks.
- Implement CDC or SCD Type 2 and build data ingestion pipelines.
- Work with workflow management tools for scheduling and managing ETL jobs.

Qualifications:
- B.E./Master's in Computer Science, Information Technology, or Computer Engineering.
- Relevant years of experience as a Snowflake Sr. Data Engineer.
- Skills in Snowflake, Python/PySpark, AWS/Azure, ETL concepts, Airflow or other orchestration tools, and Data Warehousing concepts.

If you are passionate about leveraging your skills to drive innovative solutions and create value in a dynamic environment, we encourage you to apply for this exciting opportunity. Join us in shaping the future and making a difference!
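For context, one common two-statement pattern for the SCD Type 2 loads this posting names: expire the current dimension row when tracked attributes change, then insert a fresh current version. Table and column names are invented; the Snowflake SQL is carried in a Python string since the role pairs Snowflake with Python.

```python
# Hypothetical SCD Type 2 load for a customer dimension; names are invented.
# Statement order matters: expire changed rows first, then insert new versions.
SCD2_SQL = """
-- Step 1: close out current rows whose tracked attributes changed
UPDATE dim_customer d
SET    effective_to = CURRENT_TIMESTAMP(), is_current = FALSE
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current
  AND  (d.email <> s.email OR d.segment <> s.segment);

-- Step 2: insert a fresh current row for new or just-expired customers
INSERT INTO dim_customer (customer_id, email, segment,
                          effective_from, effective_to, is_current)
SELECT s.customer_id, s.email, s.segment,
       CURRENT_TIMESTAMP(), NULL, TRUE
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.is_current
WHERE  d.customer_id IS NULL;
"""
# Would be executed via e.g. the Snowflake connector's execute_string(SCD2_SQL).
```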

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for identifying and managing the client's functional needs throughout the project's development and execution, ensuring compliance with the company's quality regulations. This involves understanding the client's operational characteristics, specifying solution requirements, and assessing the feasibility of system adaptations based on the client's business features. It is essential to stay updated on new technologies and products to facilitate continuous learning.

Your primary tasks will include surveying, analyzing, and documenting various processes, technical requirements, and business needs. You will validate design models, conduct user and supplier interviews, review estimates, and specify functional designs of use cases. Additionally, you will be responsible for issuing procedures, creating and maintaining documentation on operational circuits and systems for analysis and enhancement, assembling tests, providing user training, and identifying the necessity for new systems or proposing enhancements.

As part of your role, you must be prepared to work flexible shifts, including S3 and night shifts. Your expertise in modules such as Supply (Purchases and Inventories), Manufacturing (PDM and SFC), and Costs (JC) within the ERP JD Edwards will be crucial. Proficiency in formal analysis and development methodologies, UML, SQL, data warehousing, testing tools, and office tools is required.

Preferred skills for this position include teamwork, analytical capabilities, attention to detail, effective oral and written communication, a user-centric approach, commitment, and the ability to impart knowledge for individuals' development. Overall, you will play a vital role in implementing solutions in collaboration with the development team, providing post-implementation support, and generating reports to ensure the project's success and client satisfaction.

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

Adient is a leading global automotive seating supplier, supporting all major automakers in the differentiation of their vehicles through superior quality, technology, and performance. We are seeking a Sr. Data Analytics Lead to help build Adient's data and analytics foundation, directly benefiting our internal business units and our consumers.

You are self-motivated and data-curious, especially about how data can be used to optimize business opportunities. In this role you will own projects end-to-end, from conception to operationalization, demonstrating your comprehensive understanding of the full data product development lifecycle. You will employ various analytical techniques to solve complex problems, drive scalable cloud data architectures, and deliver data products that enhance decision making across the organization. You will also own technical support for released applications used by internal Adient teams, including the daily triage of problem tickets and change requests, and you will have 2-3 developer direct reports to accommodate this support as well as new development.

The successful candidate can lead medium to large scale analytics projects requiring minimal direction, is highly proficient in SQL and cloud-based technologies, has good communication skills, takes the initiative to explore and tackle problems, and is an effective people leader. You will be working within Adient's Advanced Analytics team: an empowered, highly capable team collaborating with Business Relationship Managers, Product Owners, Data Engineers, Production Support, and Visualization Developers within multiple business units to understand data analytics needs and translate those requirements into world-class solution architectures. You will lead and mentor a team of solution architects to research, analyze, implement, and support scalable data product solutions that power Adient's analytics across the enterprise and deliver on business priorities.

Responsibilities include owning technical support for released internal analytics applications, including the daily triage of problem tickets and change requests; leading development and execution of reporting and analytics products that enable data-driven business decisions, drive performance, and support the accomplishment of annual goals; and leading, hiring, developing, and evolving the Analytics team while providing technical direction with the support of other leads and architects. You will understand the road ahead and ensure the team has the skills and tools necessary to succeed, drive the team to develop operationally efficient analytic solutions, manage resources and budget, and partner with functional and business teams. You will advocate sound software development practices and help develop and evangelize great engineering and organizational practices.

You will lead the team that designs and builds highly scalable data pipelines using new-generation tools and technologies like Azure, Snowflake, Spark, Databricks, SQL, and Python to ingest data from various systems. You will work with product owners to ensure priorities are understood and direct the team to support the vision of the larger Analytics organization, translating complex business problem statements into analysis requirements and working with internal customers to define data product details based on expressed partner needs. You will work closely with business and technical teams to deliver enterprise-grade datasets that are reliable, flexible, scalable, and provide low cost of ownership. You will develop SQL queries and data visualizations to fulfill internal customer application reporting requirements, as well as ad-hoc analysis requests, using tools such as Power BI; thoroughly document business requirements, data architecture solutions, and processes for business and technical audiences; serve as a domain specialist on data and business processes within your area of focus, finding solutions to operational or data issues in the data pipelines; and grow the technical ability of the team.

QUALIFICATIONS
- Bachelor's degree or equivalent with 8+ years of experience in the data engineering, computer science, or statistics field, with at least 2+ years of experience in leadership/management.
- Experience developing Big Data cloud-based applications using the following technologies: SQL, Azure, Snowflake, Power BI.
- Experience building complex ADF data pipelines and Data Flows to ingest data from on-prem sources, transform, and sink into Snowflake; good understanding of ADF pipelining Activities.
- Familiarity with various Azure connectors to establish on-prem data-source connectivity, as well as Snowflake data-warehouse connectivity over a private network.
- Ability to lead and work with hybrid teams and communicate effectively, both written and verbal, with technical and non-technical multi-functional teams.
- Ability to translate complex business requirements into scalable technical solutions meeting data warehousing design standards; solid understanding of analytics needs and proactiveness to build generic solutions that improve efficiency.
- Experience with data visualization and dashboarding techniques to make complex data more accessible, understandable, and usable for driving business decisions and outcomes; efficient in Power BI.
- Extensive experience in data architecture, defining and maintaining data assets, and developing data architecture strategies to support reporting and data visualization tools.
- Understanding of common analytical data models like Kimball, ensuring physical data models align with best practice and requirements.
- Thrives in a dynamic environment, keeping composure and a positive attitude.
- A plus if your experience was in distribution or manufacturing organizations.

PREFERRED
- Experience with the Snowflake cloud data warehouse.
- Experience with Azure PaaS services.
- Experience with T-SQL, SQL Server, Azure SQL, Snowflake SQL, Oracle SQL.
- Experience with Azure Storage account connectivity.
- Experience developing visualizations with Power BI and BusinessObjects.
- Experience with Databricks.
- Experience with ADLS Gen2.
- Experience with Azure VNet private endpoints on a private network.
- Proficient with Spark and Python.
- Advanced proficiency in SQL: joining multiple data sets across different data grains, query optimization, pivoting data.
- MS Azure certifications.
- Snowflake certifications.
- Experience with other leading commercial cloud platforms like AWS.
- Experience installing and configuring ODBC and JDBC drivers on Windows.
- Candidate resides in the Plymouth, MI area.

PRIMARY LOCATION
Pune Tech Center

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

You will be part of a data analytics services company that specializes in creating and managing scalable data platforms for a diverse client base. Leveraging cutting-edge technologies, you will provide actionable insights and value through modern data stack solutions.

Your responsibilities will include designing, building, and managing customer data platforms independently using Snowflake, dbt, Fivetran, and SQL. Collaborating with clients and internal teams to gather business requirements and translating them into reliable data solutions will be a key aspect of your role. You will also develop and maintain ELT pipelines with Fivetran and dbt to automate data ingestion, transformation, and delivery. Optimizing SQL code and data models for scalability, performance, and cost efficiency in Snowflake will be crucial, as will ensuring data platform reliability, monitoring, and data quality. You will also provide technical mentorship and guidance to junior engineers and maintain comprehensive documentation of engineering processes and architecture.

The required skills and qualifications for this role include proven hands-on experience with Snowflake, dbt, Fivetran, and SQL, along with a strong understanding of data warehousing concepts, ETL/ELT best practices, and modern data stack architectures. Experience working independently and owning project deliverables end-to-end is essential, as is familiarity with version control systems like Git and workflow automation tools, together with solid communication and documentation skills. You should also be able to interact directly with clients and understand their business requirements.

Preferred skills that would be beneficial for this role include exposure to cloud platforms like AWS, GCP, and Azure, knowledge of Python or other scripting languages for data pipelines, and experience with BI/analytics tools such as Tableau, Power BI, and Looker.

In return, you will have the opportunity to lead the implementation of state-of-the-art data platforms for global clients in a dynamic, growth-oriented work environment with flexible working arrangements and a competitive compensation package. If you are interested in this opportunity, please submit your resume and a short cover letter detailing your experience with Snowflake, dbt, Fivetran, and SQL.
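For context, a hedged sketch of wrapping a dbt run from Python, the sort of orchestration glue an ELT pipeline built on Fivetran and dbt might use. It assumes dbt-core 1.5 or later, which exposes dbtRunner for programmatic invocations; the model selector is a placeholder.

```python
# Hypothetical orchestration step: invoke dbt programmatically after Fivetran
# finishes loading. Assumes dbt-core >= 1.5; the selector is a placeholder.
from dbt.cli.main import dbtRunner

dbt = dbtRunner()
result = dbt.invoke(["run", "--select", "staging+"])

if not result.success:
    raise RuntimeError(f"dbt run failed: {result.exception}")
print("dbt run succeeded")
```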

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have expert-level business-process knowledge associated with one or more of the following SAP functional modules: MM, SD, PM, PP, PS. Your responsibilities will include working with the LTMC Load method, debugging errors, and extensive LTMOM code development. Strong hands-on knowledge of LSMW, debugging, and loading of data is essential. Basic ABAP knowledge will be an advantage for debugging and LTMOM coding tasks. A minimum of 4+ years of experience in SAP data migration projects is required, and experience with Syniti ADM/ADMM will be advantageous.

You will be responsible for loading data through LTMC/LSMW and overseeing end-to-end SAP S/4HANA data migration activities. Proficiency with SAP screens and T-codes is necessary, and knowledge of Excel formulas is preferred. Your role will involve extensive experience in data quality and data migration, including proficiency in data warehousing, data analysis, and conversion planning for data migration activities. Proficiency in Microsoft SQL is preferred, along with SQL query skills, a comprehensive understanding of SQL table structure, and knowledge of relational databases.

As a lead, you will guide the team according to project requirements and understand client requirements. Communication with onsite teams and client personnel is vital, and you will be responsible for driving blueprint sessions and creating/maintaining the SAP Data Migration Plan, Integration Plan, and Cutover Plan. Additional responsibilities include SAP data extraction, transformation, and loading, as well as understanding and executing change management processes.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Retail Sell Out Consultant, you will collaborate with CPG and FMCG businesses, data engineers, and other teams to ensure successful project delivery and tool implementation, combining business domain knowledge, technical expertise, and consulting skills.

Your responsibilities will include engaging with both technical and non-technical stakeholders on the client side, interpreting problem statements and use cases, and devising feasible solutions. You will be expected to understand different types of retail data, design data models including fact and dimension table structures (see the sketch below), and drive data load and refresh strategies. In addition, you will design TradeEdge interface specifications, collaborate with developers on data conversion, prepare calculation-logic documents, and actively participate in User Acceptance Testing (UAT). Proficiency in SQL, Power BI, data warehousing, and data pipelines will be crucial for data manipulation and analysis, and experience with data visualization tools such as Tableau or Power BI, as well as cloud platform services, will also be beneficial.

Strong consulting skills such as advisory, presentation, and data storytelling are expected. You will play a key role in project leadership and execution, working closely with Technical Architects and with TradeEdge and GCP developers throughout the project lifecycle. Your ability to work in an Agile framework and collaborate effectively with cross-functional teams will be essential.

The ideal candidate holds a degree in Engineering with exposure to retail, FMCG, and supply chain management. A deep understanding of the retail domain, including POS sales, inventory management, and related experience, will be highly valued in this position.

You can expect a collaborative work environment with cross-functional teams, a strong focus on stakeholder management and team handling, and a fast-paced setting aimed at delivering timely insights to support business decisions. Excellent problem-solving skills, effective communication abilities, and a commitment to addressing complex technical challenges will be instrumental to your success as a Retail Sell Out Consultant.
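The fact-and-dimension split mentioned above can be shown in a few lines of Python. This is a toy sketch only: raw POS sell-out rows are normalized into a store dimension, a product dimension, and a fact table keyed by surrogate IDs. All column names and sample values are assumptions for illustration.

```python
# A minimal star-schema sketch for retail sell-out data; column names
# and sample rows are illustrative assumptions.
import pandas as pd

pos = pd.DataFrame({
    "store_name": ["Store A", "Store A", "Store B"],
    "sku": ["SKU1", "SKU2", "SKU1"],
    "sale_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "units_sold": [10, 4, 7],
    "revenue": [100.0, 60.0, 70.0],
})

# Dimension tables: one row per unique store / product, with surrogate keys
dim_store = pos[["store_name"]].drop_duplicates().reset_index(drop=True)
dim_store["store_id"] = dim_store.index + 1
dim_product = pos[["sku"]].drop_duplicates().reset_index(drop=True)
dim_product["product_id"] = dim_product.index + 1

# Fact table: measures plus foreign keys into the dimensions
fact_sellout = (pos.merge(dim_store, on="store_name")
                   .merge(dim_product, on="sku")
                   [["store_id", "product_id", "sale_date", "units_sold", "revenue"]])
print(fact_sellout)
```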

Posted 2 days ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

Job Description: As an integral part of the data engineering team, you will be responsible for onboarding various data sources by creating ETL pipelines and for providing resolutions or workarounds to data pipeline queries and issues as appropriate. Ensuring that the ingestion pipelines powering the lakehouse and data mesh are up and running will be a key part of your role, as will enabling end users of the data ecosystem with query debugging and optimization.

Collaboration with different teams to understand and resolve data availability and consistency issues is essential. Your efforts will ensure that teams consuming data can do so without spending the majority of their time acquiring, cleaning, and transforming it. You will also help other teams become more independent with data analysis and data quality by coaching them on tools and practices. Continuous improvement in technical knowledge and problem-resolution skills is expected, with a commitment to strive for excellence.

You should apply if you have 1-3 years of experience in ETL and data engineering, can read and write complex SQL, have prior experience with Python and Spark, are familiar with data modeling, data warehousing, and the lakehouse pattern (utilizing Databricks), have experience working with cloud services (preferably AWS), are dedicated to continuous learning and self-improvement, and can collaborate effectively as a team player with strong analytical, communication, and troubleshooting skills.

Key Skills:
- Databricks
- ETL
- AWS

Preferred Skills:
- MySQL
- Python
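Since the posting names Python, Spark, and Databricks explicitly, a minimal PySpark ingestion sketch makes the lakehouse pattern concrete: read raw CSV drops from cloud storage, apply light cleaning, and append to a Delta table. The S3 path, schema, and table names are illustrative assumptions.

```python
# A minimal ingestion sketch for a Databricks-style lakehouse, assuming a
# Spark session with Delta Lake support and an existing "bronze" schema.
# Paths and names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_ingestion").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("s3://example-bucket/raw/orders/"))

cleaned = (raw
           .dropDuplicates(["order_id"])                    # guard against replayed files
           .withColumn("order_date", F.to_date("order_date"))
           .filter(F.col("order_id").isNotNull()))          # reject malformed rows

(cleaned.write
 .format("delta")
 .mode("append")
 .saveAsTable("bronze.orders"))                             # lakehouse bronze layer
```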

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Join us to lead data modernization and maximize the utility of analytics. As a Data Owner Lead at JPMorgan Chase within the Data Analytics team, you play a crucial role in enabling the business to innovate faster through data. You are responsible for managing customer application and account-opening data, ensuring its quality and protection, and collaborating with technology and business partners to execute data requirements.

Your primary responsibilities include documenting data requirements for your product and coordinating with technology and business partners to manage the change from legacy to modernized data. You will model data for efficient querying and for use in LLMs, drawing on the business data dictionary and metadata, and develop ideas for data products by understanding analytics needs and creating prototypes for productizing datasets. You will also develop proofs of concept for natural language querying, collaborate with stakeholders to roll out capabilities, support the team in building the backlog and grooming initiatives, lead data engineering scrum teams, and manage direct or matrixed staff to execute data-related tasks.

To be successful in this role, you should hold a Bachelor's degree and have at least 5 years of experience in data modeling for relational, NoSQL, and graph databases. Expertise in data technologies such as analytics, business intelligence, machine learning, data warehousing, data management and governance, and AWS cloud solutions is crucial, as is experience with natural language processing, machine learning, and deep learning toolkits (such as TensorFlow, PyTorch, NumPy, Scikit-Learn, and Pandas). You should also be able to balance short-term goals with long-term vision in complex environments and have knowledge of open data standards, data taxonomy, vocabularies, and metadata management. A Master's degree is preferred for this position, along with the aforementioned qualifications, capabilities, and skills.
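One small piece of a natural-language-querying proof of concept is resolving which governed dataset a question refers to before any query generation happens. The toy Python sketch below uses a business-data-dictionary lookup for that step; the dictionary contents, table, and column names are invented for illustration and do not reflect any actual JPMorgan Chase system.

```python
# A toy sketch of dataset resolution via a business data dictionary;
# all entries, tables, and synonyms are illustrative assumptions.
DATA_DICTIONARY = {
    "account_opening": {
        "table": "customer.account_applications",
        "columns": ["application_id", "channel", "status", "opened_date"],
        "synonyms": ["application", "account opening", "new account"],
    },
}

def resolve_dataset(question: str) -> dict | None:
    """Match a question to a dictionary entry via its registered synonyms."""
    q = question.lower()
    for entry in DATA_DICTIONARY.values():
        if any(term in q for term in entry["synonyms"]):
            return entry
    return None

entry = resolve_dataset("How many new account applications came in last month?")
if entry:
    print(f"Candidate table: {entry['table']}, columns: {entry['columns']}")
```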

Posted 2 days ago

Apply