3.0 - 7.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As an Ignition Application Administrator at EY, you will be a key member of the Enterprise Services Data team. Your role will involve collaborating closely with peer platform administrators, developers, Product/Project Seniors, and Customers to administer the existing analytics platforms. While focusing primarily on Ignition, you will also be cross-trained on other tools such as Qlik Sense, Tableau, Power BI, SAP BusinessObjects, and more. Your willingness to tackle complex problems and find innovative solutions will be crucial in this role.

In this position, you will have the opportunity to work in a start-up-like environment within a Fortune 50 company, driving digital transformation and leveraging insights to enhance products and services. Your responsibilities will include installing and configuring Ignition, monitoring the platform, troubleshooting issues, managing data source connections, and contributing to the overall data platform architecture and strategy. You will also be involved in integrating Ignition with other ES Data platforms and Business Unit installations.

To succeed in this role, you should have at least 3 years of experience in customer success or a customer-facing engineering capacity, along with expertise in large-scale implementations and complex solutions environments. Experience with the Linux command line, cloud operations, Kubernetes application deployment, and cloud platform architecture is essential. Strong interpersonal and written communication skills are also key for this position.

Ideally, you should hold a BA/BS degree in technology, computing, or a related field, although relevant work experience may be considered in place of formal education. The position may require flexibility in working hours, including weekends, to meet deadlines and fulfill application administration obligations.

Join us at EY and contribute to building a better working world by leveraging data, technology, and your unique skills to drive innovation and growth for our clients and society.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a strong candidate for this role, you should possess 3-5+ years of recent experience working with one or more of the following technologies: Azure Serverless SQL DB, Azure SQL Managed Instance, Azure PostgreSQL Flex Server, Azure MySQL Server, NoSQL, and Snowflake. Your expertise in these areas will be vital in successfully fulfilling the responsibilities of the position.
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Solution Architect at DHL Group, a global logistics provider with a workforce of around 600,000 employees spanning over 220 countries and territories, you will play a pivotal role in designing, implementing, and optimizing analytics, data warehousing, and reporting solutions. Your expertise will be essential in ensuring that all solutions meet business requirements, adhere to performance benchmarks, and align with industry standards.

Your responsibilities will include:
- Leading the design and implementation of analytics and data warehousing solutions.
- Optimizing data pipelines and integrations for accurate and timely data analysis and reporting.
- Conducting data modeling and design to enhance data quality and consistency.
- Collaborating with project teams to define business requirements.
- Providing technical guidance to development teams, including coding and solution design.
- Monitoring the performance of BI systems and proposing improvements, while collaborating with cross-functional teams to drive innovation and enhance the organization's data capabilities.

To excel in this role, you should have a minimum of 6 years of experience in IT, with at least 4 years in a solution architect role focused on analytics and data warehousing. Proficiency in data modeling, ETL processes, and analytics tools such as Power BI and Snowflake is required. Experience with cloud platforms like AWS and Azure, as well as familiarity with microservices architecture, will be beneficial. Strong analytical and problem-solving skills, excellent verbal and written communication skills, and the ability to explain complex technical concepts to non-technical stakeholders are essential. Experience working in Agile/Scrum environments with a collaborative approach to project delivery is also preferred.

At DHL Group, we offer you the opportunity to join a leading global company, be part of a dynamic team, enjoy flexible working hours and remote work options, thrive in an international environment, and benefit from an attractive compensation and benefits package. Join us, make a positive impact, and build an amazing career with DHL Group.
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
You should have at least 2 years of professional work experience implementing data pipelines using Databricks and a data lake. A minimum of 3 years of hands-on programming experience in Python within a cloud environment (preferably AWS) is necessary for this role. Two years of professional work experience with real-time streaming systems such as Event Grid and Event topics would be highly advantageous.

You must possess expert-level knowledge of SQL to write complex, highly optimized queries for processing large volumes of data effectively. Experience in developing conceptual, logical, and/or physical database designs using tools like ErWin, Visio, or Enterprise Architect is expected. A minimum of 2 years of hands-on experience working with databases like Snowflake, Redshift, Synapse, Oracle, SQL Server, Teradata, Netezza, Hadoop, MongoDB, or Cassandra is required. Knowledge of or experience with architectural best practices for building data lakes is a must for this position.

Strong problem-solving and troubleshooting skills are necessary, along with the ability to make sound judgments independently. You should be capable of working independently and providing guidance to junior data engineers. If you meet the above requirements and are ready to take on this challenging role, we look forward to your application.

Warm regards,
Rinka Bose
Talent Acquisition Executive, Nivasoft India Pvt. Ltd.
Mobile: +91-9632249758 (India) | 732-334-3491 (USA)
Email: rinka.bose@nivasoft.com | Web: https://nivasoft.com/
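A minimal sketch of the kind of Databricks/PySpark pipeline work this posting describes, for orientation only: the bucket paths, column names, and Delta output are invented assumptions (Delta Lake is assumed available, as it is on Databricks), not details from the role.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Read raw JSON landed in a hypothetical data lake location.
orders = spark.read.format("json").load("s3://example-bucket/raw/orders/")

# Basic cleanup: deduplicate, normalize timestamps, drop invalid rows.
clean = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)

# Append to a curated Delta table for downstream consumers.
clean.write.format("delta").mode("append").save("s3://example-bucket/curated/orders/")
```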
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
rajasthan
On-site
You will be working as a Mid-Level Snowflake Database Administrator, providing database and application administration and support for the Information Management Analytical Service. This role involves managing data integration, data warehouse, and business intelligence, including enterprise reporting, predictive analytics, data mining, and self-service solutions. You will collaborate with different teams to offer database and application administration, job scheduling/execution, and code deployment support.

Your key responsibilities will include providing database support for Big Data tools, performing maintenance tasks, performance tuning, monitoring, developer support, and administrative support for the application toolset. You will participate in a 24/7 on-call rotation for enterprise job scheduler activities, follow ITIL processes, create and update technical documentation, install, upgrade, and configure the application toolset, and maintain regular attendance.

To qualify for this role, you are required to have a Bachelor's degree or equivalent experience, along with 5 years of work experience in IT. You should have experience in cloud database administration, installing and configuring commercial applications at the OS level, and effective collaboration in a team environment. Preferred skills include scripting in Linux and Windows, experience with Terraform, and knowledge of the insurance and/or reinsurance industry.

In terms of technical requirements, you should be proficient in databases such as Snowflake, Vertica, Impala, PostgreSQL, Oracle, and SQL Server; operating systems including Unix, Linux, CentOS, and Windows; and reporting tools including SAP BusinessObjects, Tableau, and Power BI.

This position falls under SOW#23 - Snowflake DBA and requires a minimum of 4 and a maximum of 5 years of experience. Thank you for considering this opportunity.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
Experian is seeking a talented Service Quality Engineer to excel in a dynamic, agile environment, supporting projects and developers by leveraging innovative technologies. As a key contributor, you will play a crucial role in ensuring our platform consistently delivers top-notch quality, supporting Experian's commitment to providing exceptional service to consumers.

In this role, you will investigate the root causes of product issues, evaluate their extent and impact, and collaborate with our Platform and Engineering teams to implement effective solutions, along with establishing testing protocols to prevent regressions. The Service Quality team takes a holistic approach to platform service quality, encompassing the entire software development lifecycle and post-deployment phases. The Service Quality Engineer manages issues from inception to resolution, maintaining transparent communication with both customers and internal stakeholders throughout the process, all while prioritizing the safeguarding of the consumer-permissioned data managed by our platform.

Key Qualifications:
- 5 to 8 years of experience supporting a commercial online software-based product, involving troubleshooting, validation, testing, and analysis.
- Recent experience with cloud technologies, particularly AWS (Fargate, S3, RDS, Lambda).
- Proficiency in log and application management tools like Splunk, CloudWatch, and Datadog.
- Hands-on expertise in test automation tools, frameworks, and principles, focusing on APIs, UI, and performance testing.
- Extensive experience in gathering and assessing specifications and requirements.
- Ability to thrive in a dynamic environment, adapt to various technologies, and manage multiple projects concurrently.
- Strong communication (verbal and written) and negotiation/documentation skills.

Additional Preferred Experience:
- Supporting product development for financial services businesses.
- Assisting data science efforts and machine learning systems.
- Building systems to support financial services businesses.
- Familiarity with Docker, Kubernetes, and OpenShift.
- Experience with security and privacy compliance standards (GDPR, CCPA, ISO 27001, PCI, HIPAA, etc.).
- Knowledge and/or use of DDA, FDX, OFX, and/or FIX protocols.
- Hands-on experience with Snowflake, Kafka, and Elasticsearch is highly desirable.
- Recent experience in developing commercial systems managing PII, secure data, and transactions, including server-side development and REST/JSON APIs.
- AWS certification is a plus.

Candidates who may not find this role or team suitable include those who:
- Hold pessimistic views towards new technologies.
- Prefer theoretical discussions over practical experimentation.
- Dislike participating in Halloween-themed activities.
- Prefer focusing on one task exclusively at a time.
- Disregard deadlines and schedules as unimportant.

Experian is committed to fostering a diverse and inclusive culture, prioritizing work-life balance, professional development, authenticity, collaboration, wellness, and recognizing and rewarding its employees. With numerous accolades for its people-centric approach, including recognition as a Great Place to Work in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work globally (4.4 stars), Experian provides a supportive and engaging environment for its employees. Explore Experian Life on social media or visit our Careers Site to discover more about working at Experian.
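To give the API test-automation qualification above some shape, here is a minimal, hypothetical check in Python using requests with a pytest-style assertion; the endpoint URL and response contract are invented for illustration, not Experian specifics.

```python
import requests

BASE_URL = "https://api.example.com/v1"  # hypothetical service under test

def test_accounts_endpoint_returns_accounts():
    # Availability-and-contract check of the kind used to guard against regressions.
    resp = requests.get(f"{BASE_URL}/accounts", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    assert isinstance(body.get("accounts"), list)
```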
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Mid-Level Data Engineer at our organization, you will be responsible for designing, developing, and maintaining scalable data pipelines using technologies such as Python and SQL. Your role will involve collaborating with data scientists, analysts, and business stakeholders to ensure that our data infrastructure meets the evolving needs of the organization.

Your key responsibilities will include developing and implementing data quality checks, automating data ingestion from various sources into Snowflake, and using cloud platforms like AWS, Azure, and GCP to manage data infrastructure. You will also work with Infrastructure as Code (IaC) tools such as Terraform to provision cloud resources.

In this role, you will collaborate closely with data scientists and analysts to understand data requirements and translate them into technical solutions. Additionally, you will be expected to version-control code using Git, document data pipelines and processes for future reference, and engage in knowledge sharing within the team.

To qualify for this position, you should have a minimum of 3 years of experience in data engineering or a related field. You must have proven expertise in designing, developing, and deploying data pipelines, along with hands-on experience in Snowflake or Databricks, Python or PySpark, and SQL. Experience with data warehousing, data modeling concepts, and cloud platforms is highly desirable.

If you possess working knowledge of Terraform, Infrastructure as Code (IaC) principles, and Git for version control, along with excellent communication, collaboration, problem-solving, and analytical skills, you are an ideal candidate for this role. Experience with data orchestration tools such as Airflow and Luigi, and with data visualization tools like Tableau and Power BI, will be considered a bonus. Join us in building the infrastructure that drives data-driven decision-making and contribute to our team's success!
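As a rough illustration of the "data quality checks" responsibility named above, here is a minimal sketch in Python with pandas; the column names and rules are assumptions invented for the example.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures; an empty list means the batch passed."""
    failures = []
    if df["order_id"].isna().any():               # completeness
        failures.append("order_id contains nulls")
    if df.duplicated(subset=["order_id"]).any():  # uniqueness
        failures.append("duplicate order_id values found")
    if (df["amount"] < 0).any():                  # validity
        failures.append("negative amounts present")
    return failures
```

In practice, a check like this would run at the ingestion boundary, with failures logged or blocking the load depending on severity.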
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
This role seeks a dynamic individual to join the M&R Sales Tech team, bringing expertise in software development of ETL and ELT jobs for the data warehouse software development team. The position plays a crucial role in defining the design and architecture during the migration from legacy SSIS technology to cutting-edge cloud technologies such as Azure, Databricks, and Snowflake. The ideal candidate will possess a robust background in software architecture, data engineering, and cloud technologies.

Key Responsibilities:
- Architectural Design: Design and implement ETL data architectures, including creating algorithms, developing data models and schemas, and setting up data pipelines.
- Technical Leadership: Provide technical leadership to the software development team to ensure alignment of data solutions with business objectives and the overall IT strategy.
- Data Strategy and Management: Define data strategy and oversee data management within the organization, focusing on data governance, quality, privacy, and security using Databricks and Snowflake technologies.
- Implementation of Machine Learning Models: Utilize Databricks for implementing machine learning models, conducting data analysis, and deriving insights.
- Data Migration and Integration: Transfer data from on-premise or other cloud platforms to Snowflake, integrating Snowflake and Databricks with other systems for seamless data flow.
- Performance Tuning: Optimize database performance by fine-tuning queries, enhancing processing speed, and improving data storage and retrieval mechanisms.
- Troubleshooting and Problem Solving: Identify and resolve issues related to the database, data migration, data pipelines, and other ETL processes, addressing concerns like data quality, system performance, and data security.
- Stakeholder Communication: Communicate effectively with stakeholders to grasp requirements and deliver solutions that meet business needs.

Required Qualifications:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- Experience: Minimum of 8 years of experience in software development and architecture roles.
- Technical Skills: Proficiency in ETL/ELT processes and tools, particularly SSIS; 5+ years of experience with large data warehousing applications; solid experience with reporting tools like Power BI and Tableau; familiarity with creating batch and real-time jobs with Databricks and Snowflake, and working with streaming platforms like Kafka and Airflow.
- Soft Skills: Strong leadership and team management skills, problem-solving abilities, and effective communication and interpersonal skills.

Preferred Qualifications:
- Experience with Agile development methodologies.
- Certification in relevant cloud technologies (e.g., Azure, Databricks, Snowflake).

Primary Skills: Azure, Snowflake, Databricks
Secondary Skills: SSIS, Power BI, Tableau

Role Purpose: The purpose of the role is to create exceptional architectural solution designs and provide thought leadership, enabling delivery teams to achieve exceptional client engagement and satisfaction.

Key Roles and Responsibilities:
- Develop architectural solutions for new deals and major change requests, ensuring scalability, reliability, and manageability of systems.
- Provide solutioning for RFPs from clients, ensuring overall design assurance.
- Manage the portfolio of to-be solutions to align with business outcomes, analyzing the technology environment, client requirements, and enterprise specifics.
- Offer technical leadership in designing, developing, and implementing custom solutions using modern technology.
- Define current- and target-state solutions, articulate architectural targets and recommendations, and propose investment roadmaps.
- Evaluate and recommend solutions for integration with the technology ecosystem.
- Collaborate with IT groups to ensure task transition, performance, and issue resolution.
- Enable delivery teams by providing optimal delivery solutions, building relationships with stakeholders, and developing relevant metrics to drive results.
- Manage multiple projects, identify risks, ensure quality assurance, and recommend tools for reuse and automation.
- Support pre-sales teams in presenting solution designs to clients, negotiate requirements, and demonstrate thought leadership.
- Competency Building and Branding: Develop PoCs, case studies, and white papers, attain market recognition, and mentor team members for career development.
- Team Management: Resourcing, talent management, performance management, and employee satisfaction and engagement.

Join us at Wipro, a business driven by purpose and reinvention, where your ambitions can be realized through constant evolution and empowerment. Applications from individuals with disabilities are encouraged.
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
vadodara, gujarat
On-site
This role involves building, managing, and optimizing data pipelines, views, and models. You will work closely with teams to transition models into production for data and analytics consumers. Additionally, there is an opportunity to engage with clients on visualisation requirements to help them extract maximum value from their data.

To be successful in this role, you must have experience with ETL/ELT techniques for integrating into Data Warehouse solutions. A solid understanding of relational databases and data warehouse methodology, particularly with Azure Data Factory, is essential. You should also possess knowledge of various architectures and methodologies such as metadata management, performance management, and data quality management. Proficiency in creating and managing data environments in Snowflake, monitoring the data integration process, and working in a cloud architecture with data lakes is required. Excellent SQL skills are a must to develop and operate efficient, scalable, and reliable data pipelines.

The ideal candidate will have working experience with Snowflake and Azure, as well as proficiency in C#, SQL, and Python. Understanding of the Kimball dimensional model is desirable, and being a certified Snowflake SnowPro is a plus.

Snowflake undergoes frequent updates, so it is crucial to implement best-practice solutions, stay current with new releases, and keep customers informed. On the Azure side, familiarity with Azure Data Factory for big-data ingest; Azure Functions (C#) for dynamic scaling, high-velocity middleware, ingest, or APIs; Azure Storage (tables, blobs, queues); Cosmos DB; Virtual Machines; Container Instances; Key Vault; and Azure DevOps is beneficial.
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Data Engineering Lead, you will be responsible for designing, developing, and maintaining data pipelines using Python and related technologies. You will lead and mentor a team of data engineers, offering technical guidance and support. Collaboration with cross-functional teams to understand data requirements and deliver solutions will be a key aspect of your role, as will implementing and managing data quality and validation processes. You will work on optimizing data pipelines for enhanced performance and scalability, contributing to the establishment of data engineering best practices and standards.

To be successful in this role, you should possess at least 6 years of Python development experience, with a preference for Python 3, and a minimum of 2 years of experience leading development teams. A strong working knowledge of Linux CLI environments is essential, along with expertise in data processing using pandas or Polars. Proven experience in constructing data pipelines and familiarity with general data engineering practices are crucial.

Proficiency in database technologies such as Snowflake and ORMs like SQLAlchemy is required. You should also be adept at developing REST APIs in Python, have a solid grasp of Python testing frameworks, and have experience with Docker containerization of Python applications. Strong Git version-control skills and excellent communication and leadership abilities are indispensable for this role. Prior experience with the Snowflake database will be an added advantage.
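Because the role combines pandas, SQLAlchemy, and Snowflake, a small hedged sketch may help orient candidates; the connection URL, warehouse, and query below are placeholders (and assume the snowflake-sqlalchemy dialect is installed), not details from the posting.

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical Snowflake URL; real credentials and account locator would come from config.
engine = create_engine("snowflake://USER:PASSWORD@ACCOUNT/ANALYTICS/PUBLIC?warehouse=DEV_WH")

with engine.connect() as conn:
    # Pull an aggregate into a DataFrame for downstream processing.
    df = pd.read_sql(
        text("SELECT region, SUM(amount) AS revenue FROM sales GROUP BY region"),
        conn,
    )

print(df.sort_values("revenue", ascending=False).head())
```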
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
Genpact is a global professional services and solutions firm committed to delivering outcomes that shape the future. With a team of over 125,000 professionals across more than 30 countries, we are fueled by curiosity, agility, and the aspiration to create enduring value for our clients. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises, including the Fortune Global 500. We leverage our deep business and industry expertise, digital operations services, and proficiency in data, technology, and AI to drive impactful change.

We are currently inviting applications for the position of Assistant Manager - SQL Developer.

Responsibilities:
- Demonstrated strong proficiency in SQL, with relevant experience and the ability to optimize large, complex SQL statements.
- Capable of planning requirements from high-level specifications.
- Proficient in using code-versioning tools, managing server resources efficiently, and optimizing performance.
- Familiarity with the practical application of Python and its integration with SQL.
- Building, scheduling, and monitoring jobs, and troubleshooting issues, on Snowflake.
- Knowledge of ETL processes.
- Effective communication, adaptability, and collaborative skills.

Qualifications we seek in you:

Minimum Qualifications:
- Experience in developing MS SQL queries and procedures.
- Ability to create custom reports and modify ERP user forms to enhance organizational productivity and implement automation.

Skills Needed:
- Informatica
- Teradata

Preferred Qualifications:
- Snowflake
- Control-M
- SQL
- Python

Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jan 10, 2025, 3:00:11 AM
Unposting Date: Feb 9, 2025, 10:29:00 AM

If you are passionate about working in a dynamic environment and possess the required skills and qualifications, we invite you to apply for this exciting opportunity.
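The "building, scheduling, and monitoring jobs on Snowflake" item can be pictured with a small sketch using the snowflake-connector-python package; the account, credentials, and table names are placeholders invented for illustration, and a production job would pull them from secure configuration.

```python
import snowflake.connector

# Hypothetical connection details; real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345", user="ETL_SVC", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="JOBS",
)

cur = conn.cursor()
# Create (or replace) a nightly Snowflake TASK that refreshes a reporting table.
cur.execute("""
    CREATE OR REPLACE TASK nightly_sales_refresh
      WAREHOUSE = ETL_WH
      SCHEDULE = 'USING CRON 0 2 * * * UTC'
    AS
      INSERT INTO reporting.daily_sales
      SELECT sale_date, SUM(amount) FROM raw.sales GROUP BY sale_date
""")
cur.execute("ALTER TASK nightly_sales_refresh RESUME")  # tasks are created suspended
conn.close()
```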
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
vijayawada, andhra pradesh
On-site
You have a great opportunity as a Power BI Developer in Vijayawada. With over 3 years of experience, you will be responsible for developing advanced dashboards using Power BI. Your expertise in data modeling, design standards, tools, and best practices will be crucial for creating enterprise data models.

You should have excellent knowledge of Power BI Desktop, charting widgets, and connecting to various data sources. Your role will involve building Power BI reports that leverage DAX functions. Knowledge of writing SQL statements using MS SQL Server is essential, and experience with ETL, SSAS, and SSIS will be a plus. Familiarity with Power BI Mobile is desired.

Experience with SQL Server or Postgres is a must for this position. Azure experience is beneficial, and familiarity with the Power BI connectors for Synapse, Snowflake, Azure Data Lake, and Databricks is an added advantage. This role offers you the opportunity to work with cutting-edge technologies and make a significant impact in the field of data analytics.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Genpact is a global professional services and solutions firm committed to shaping the future. With a workforce of over 125,000 individuals in more than 30 countries, we are dedicated to creating lasting value for our clients through our innate curiosity, entrepreneurial agility, and deep industry knowledge. At Genpact, we serve and transform leading enterprises, including the Fortune Global 500, by leveraging our expertise in digital operations services, data, technology, and AI.

We are currently seeking applications for the position of Principal Consultant, Dot Net Developer. In this role, you will play a crucial part in coding, testing, and delivering high-quality deliverables. Additionally, you should be enthusiastic about learning new technologies to enhance your skill set.

**Responsibilities:**
- Collaborate closely with the business unit and team members globally to understand and document requirements.
- Offer innovative solutions to complex business issues using our technology practices.
- Develop business-tier components and relational database models.
- Create interactive web-based user interfaces and integration solutions with 3rd-party data providers and systems.
- Establish unit/integration/functional tests and contribute to the enhancement of our architecture.
- Follow the development process and guidelines, conduct code reviews, troubleshoot production issues, and stay updated on technology trends to recommend improvements.

**Qualifications:**

**Minimum Qualifications:**
- BE/B Tech/MCA
- Excellent written and verbal communication skills

**Preferred Qualifications/Skills:**
- Bachelor's degree in computer science/computer engineering.
- Proficiency in building highly interactive web-based user interfaces using HTML, CSS, JavaScript, and AngularJS.
- Experience in .NET, .NET Core, C#, SQL Server, Python, Azure Databricks, and Snowflake.
- Familiarity with building APIs (REST and GraphQL) and distributed caching (Redis, Cassandra, etc.).
- Working experience with Azure PaaS services and SQL/NoSQL database platforms.
- Strong .NET and C# skills for implementing object- and service-oriented architecture.
- ASP.NET Core experience for web and API development, OIDC and OAuth2 experience, and experience building automated test suites.
- Experience configuring CI/CD pipelines, effective communication skills, Agile development practices, and familiarity with git source control and GitFlow fundamentals.
- Sitecore CMS experience is a plus.

**Job Details:**
- Job Title: Principal Consultant
- Primary Location: India-Bangalore
- Schedule: Full-time
- Education Level: Bachelor's/Graduation/Equivalent
- Job Posting: Oct 4, 2024, 6:11:16 AM
- Unposting Date: Nov 3, 2024, 11:59:00 PM

**Master Skills List:** Consulting
**Job Category:** Full Time
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Data Engineer at our company, you will be responsible for designing and building data models to support business requirements. Your role will involve developing and maintaining data ingestion and processing systems, as well as implementing data storage solutions such as databases and data lakes. Ensuring data consistency and accuracy through data validation and cleansing techniques will be a key part of your responsibilities. You will collaborate with cross-functional teams to identify and address data-related issues, drawing on your proficiency in programming languages like Python and your experience with big data and Azure. Azure experience is essential, as is good familiarity with Snowflake. Knowledge of relational database management systems such as MySQL and PostgreSQL will be advantageous.

We are looking for an individual with strong problem-solving and analytical skills, excellent communication and collaboration abilities, and the adaptability and willingness to learn new technologies and techniques.

At NucleusTeq, our positive and supportive culture empowers our associates to perform at their best every day. We celebrate individuality and offer flexibility to promote health, well-being, confidence, and awareness. Join us in a culture that promotes excellence and supports healthy, happy lives.
Posted 2 days ago
5.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
You have an exciting opportunity for the position of Data GTM Lead, based in Chennai, TN (also open to Bangalore). With over 10 years of experience in the IT industry, including at least 5 years leading an IT professional services team or organization, you will be responsible for the following duties:

As leader of the professional IT services team, you will develop the IT go-to-market strategy and plan and create practice offerings along with descriptive and training materials. You should have experience working against a team-specific revenue plan and managing a virtual onshore IT consulting team of more than 10 people, while providing virtual leadership to a larger worldwide team of more than 25 people. Your role will involve collaborating with sales teams to market and sell solutions directly to customers.

In terms of delivery management, you should have experience leading implementations in areas such as MDM, Data Governance, Data Engineering, and Cloud Data Warehousing. You will be responsible for overseeing delivery execution and addressing system-level issues that impact contractual commitments.

Your technical knowledge should encompass current high-level IT concepts that influence solution proposals, including Data Fabric, Data Mesh, medallion-based architectures, and data warehouses/lakes. Familiarity with basic artificial intelligence/machine learning concepts, both at the software component level and at the level of broader enterprise AI/ML capabilities, is essential. Direct experience with the major hyperscalers (Azure, AWS, GCP) and with software/SaaS vendors like Informatica, Reltio, Semarchy, Azure Purview/Synapse, Snowflake, and Databricks is required.

The qualifications for this role include a minimum college degree or equivalent. Additionally, you should be willing to travel domestically and internationally to engage directly with clients and team members. If you meet these qualifications and are excited about this opportunity, we look forward to receiving your application.
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
The role of Support Engineer requires monitoring and maintaining integration pipelines, data flows, and jobs to ensure system uptime, performance, and stability. You will be responsible for troubleshooting issues promptly and ensuring timely resolution to minimize business impact. Additionally, you will monitor and maintain data models to ensure data accuracy and consistency. Using ITIL best practices, you will manage incidents efficiently to ensure minimal disruption to operations.

In Digital Transformation (DT) projects, you will triage incidents to identify and resolve issues promptly. Handling access requests, you will ensure proper authorization and security protocols are followed. For change and problem management, you will raise and manage Change Requests (CRs) for any system modifications or updates. It will be essential to conduct root-cause analysis for recurring issues and document Problem Tickets for long-term solutions. Adherence to ITIL processes for managing changes and resolving problems effectively is crucial.

Your role will also involve pipeline validation and analysis, where you will apply SnapLogic knowledge to troubleshoot issues within SnapLogic pipelines, APIs, and other integration points, collaborating with stakeholders to understand integration requirements and recommend solutions.

In terms of service delivery and improvement, you will develop, implement, and maintain service delivery processes in accordance with ITIL best practices. Identifying opportunities for process improvements and automation to enhance service delivery will be a continuous effort. Providing regular updates and reports on ongoing initiatives to stakeholders and the PMO is also a key aspect of the role. Collaboration with team members and stakeholders to understand requirements and provide effective support solutions will be crucial, as will communication with stakeholders, including senior management, business users, and other teams, on incident status and resolution efforts. Facilitating User Acceptance Testing (UAT) for projects and Change Requests will also be part of your responsibilities.

Qualifications required for this role include a Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. A minimum of 4 years of experience in a support engineer role, with 2 years of relevant SnapLogic experience, is preferred, ideally in the pharmaceutical or a related domain. Proven experience in monitoring and maintaining jobs, schedules, and data models is required, along with strong hands-on experience with the SnapLogic integration platform and proficiency with integration technologies. Knowledge of common data formats and various databases, strong diagnostic, troubleshooting, and ITIL skills, and excellent communication, collaboration, problem-solving, and organizational abilities are also necessary.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
indore, madhya pradesh
On-site
You are a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction, sought to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance.

Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, utilizing tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role.

Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs.

Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation. Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification.

Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions. Documenting data workflows, technical designs, and operational procedures will also be part of your responsibilities.

Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP) for data extraction. Strong SQL skills, performance-optimization techniques, and data transformation expertise, together with soft skills like strong analytical thinking, problem-solving abilities, and excellent communication, are essential for this role.

Location: Bhilai, Indore
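To make the COPY INTO ingestion step concrete, here is a minimal, hypothetical bulk-load sketch using snowflake-connector-python; the stage, file format, and table names are invented examples, and continuous loading of the kind the posting mentions would typically use Snowpipe instead.

```python
import snowflake.connector

# Placeholder credentials; a real pipeline would read these from a vault or env vars.
conn = snowflake.connector.connect(
    account="xy12345", user="LOADER", password="***",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)

cur = conn.cursor()
# Bulk-load staged CSV files into a raw table; COPY INTO reports status per file.
cur.execute("""
    COPY INTO raw_orders
    FROM @landing_stage/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    ON_ERROR = 'CONTINUE'
""")
for row in cur.fetchall():
    print(row)  # per-file load results: file name, status, rows parsed/loaded
conn.close()
```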
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a Snowflake Data Engineer at our organization, you will play a vital role in designing, developing, and maintaining our data infrastructure. Your responsibilities will include ingesting, transforming, and distributing data using Snowflake and AWS technologies, and collaborating with various stakeholders to ensure efficient data pipelines and secure data operations.

Your key responsibilities will involve designing and implementing data pipelines using Snowflake and AWS technologies. You will leverage tools like SnowSQL, Snowpipe, NiFi, Matillion, and dbt to ingest, transform, and automate data-integration processes. Implementing role-based access controls and managing AWS resources will be crucial for ensuring data security and supporting Snowflake operations. Additionally, you will be responsible for optimizing Snowflake queries and data models for performance and scalability.

To excel in this role, you should have strong proficiency in SQL and Python, along with hands-on experience with Snowflake and AWS services. An understanding of ETL/ELT tools, data warehousing concepts, and data quality techniques is essential. Your analytical skills, problem-solving abilities, and excellent communication skills will enable you to collaborate effectively with data analysts, data scientists, and other team members.

Preferred skills include experience with data virtualization, machine learning and AI concepts, data governance, and data-security best practices. Staying updated with the latest advancements in Snowflake and AWS technologies will be essential for this role.

If you are a passionate and experienced Snowflake Data Engineer with 5 to 7 years of experience, we invite you to apply and be a part of our team. This is a full-time position based in Gurgaon, with a hybrid work mode accommodating India, UK, and US work shifts.
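The role-based access control responsibility might look like the following hedged sketch, again via snowflake-connector-python; the role, schema, and user names are illustrative assumptions, not details from the posting.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="SECADMIN_SVC", password="***", role="SECURITYADMIN",
)

# Grant an analyst role read-only access to one schema, then grant the role to a user.
statements = [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO USER JANE_DOE",
]
cur = conn.cursor()
for stmt in statements:
    cur.execute(stmt)
conn.close()
```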
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Back-End Developer at our company, you will be responsible for developing an AI-driven prescriptive remediation model for SuperZoom, CBRE's data quality platform. Your primary focus will be on analyzing invalid records flagged by data quality rules and suggesting corrected values based on historical patterns. It is crucial that the model you develop learns from past corrections to continuously enhance its future recommendations. The ideal candidate for this role should possess a solid background in machine learning, natural language processing (NLP), data quality, and backend development.

Your key responsibilities will include developing a prescriptive remediation model to analyze and suggest corrections for bad records, implementing a feedback loop for continuous learning, building APIs and backend workflows for seamless integration, designing a data pipeline for real-time processing of flagged records, optimizing model performance for large-scale datasets, and collaborating effectively with data governance teams, data scientists, and front-end developers. Additionally, you will be expected to ensure the security, scalability, and performance of a system handling sensitive data.

To excel in this role, you should have at least 5 years of backend development experience with a focus on AI/ML-driven solutions. Proficiency in Python, including pandas, PySpark, and NumPy, is essential. Experience with machine learning libraries like scikit-learn, TensorFlow, or Hugging Face Transformers, along with a solid understanding of data quality, fuzzy matching, and NLP techniques for text correction, will be advantageous. Strong SQL skills and familiarity with databases such as PostgreSQL, Snowflake, or MS SQL Server are required, as well as expertise in building RESTful APIs and integrating ML models into production systems. Your problem-solving and analytical abilities will also be put to the test in handling diverse data quality issues effectively.

Nice-to-have skills for this role include experience with vector databases (e.g., Pinecone, Weaviate) for similarity search, familiarity with LLMs and fine-tuning for data-correction tasks, experience with Apache Airflow for workflow automation, and knowledge of reinforcement learning to enhance remediation accuracy over time.

Your success in this role will be measured by the accuracy and relevance of suggestions for data quality issues in flagged records, improved model performance through iterative learning, seamless integration of the remediation model into SuperZoom, and on-time delivery of backend features in collaboration with the data governance team.
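As a rough, hedged illustration of suggesting corrected values from historical patterns, here is a tiny fuzzy-matching baseline using Python's standard-library difflib; the field values are invented, and the production remediation model described above would go well beyond this.

```python
import difflib

def suggest_correction(bad_value: str, historical_values: list[str]) -> str | None:
    """Return the closest historically accepted value, or None if nothing is similar enough."""
    matches = difflib.get_close_matches(bad_value, historical_values, n=1, cutoff=0.8)
    return matches[0] if matches else None

# Invented example: city names previously accepted by data quality rules.
history = ["Thiruvananthapuram", "Hyderabad", "Bengaluru", "Vadodara"]
print(suggest_correction("Hyderbad", history))  # -> "Hyderabad"
```

A learned model would replace the fixed similarity cutoff with scores trained on past accepted corrections, which is where the feedback loop in the posting comes in.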
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
As a skilled Python Developer with expertise in Postgres and Snowflake, you will be responsible for developing data-driven applications and implementing robust data solutions. Your role will involve designing and optimizing database schemas in Postgres, working with Snowflake for data warehousing and analytics, and collaborating with cross-functional teams to gather requirements and deliver effective solutions. If you are passionate about data engineering and enjoy solving complex problems, we want to hear from you!

Key Responsibilities:
- Develop and maintain data applications using Python.
- Design and optimize database schemas in Postgres.
- Work with Snowflake for data warehousing and analytics.
- Collaborate with cross-functional teams to gather requirements and deliver effective solutions.
- Ensure the performance, quality, and responsiveness of applications.

Key Requirements:
- 3-5 years of experience in Python development.
- Strong knowledge of Postgres for database management.
- Experience with Snowflake for data warehousing solutions.
- Proficiency in writing efficient SQL queries.
- Familiarity with data modeling and ETL processes.

Preferred Qualifications:
- Knowledge of data visualization tools (e.g., Tableau, Power BI) is a plus.
- Understanding of cloud platforms (e.g., AWS, Azure) and data engineering best practices.
- Exposure to Agile development methodologies.
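For the Postgres side of this role, a minimal hedged sketch with psycopg2 might look like the following; the database, table, and credentials are placeholders, not details from the posting.

```python
import psycopg2

# Hypothetical connection parameters; real ones would come from environment/config.
conn = psycopg2.connect(host="localhost", dbname="appdb", user="app", password="***")

with conn, conn.cursor() as cur:
    # Parameterized query: psycopg2 handles quoting and escaping safely.
    cur.execute(
        "SELECT id, name FROM customers WHERE created_at >= %s ORDER BY id",
        ("2025-01-01",),
    )
    for customer_id, name in cur.fetchall():
        print(customer_id, name)
conn.close()
```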
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a skilled and seasoned Senior Data Engineer to join our innovative team. The ideal candidate has a solid foundation in data engineering and proficiency in Azure, particularly Azure Data Factory (ADF), Azure Fabric, Databricks, and Snowflake. In this role, you will design, build, and maintain data pipelines, ensure data quality and accessibility, and collaborate with various teams to support our data-centric initiatives.

Your responsibilities will include crafting, enhancing, and sustaining robust data pipelines using tools such as Azure Data Factory, Azure Fabric, Databricks, and Snowflake. You will work closely with data scientists, analysts, and stakeholders to understand data requirements, guarantee data availability, and maintain data quality. Implementing and refining ETL processes to efficiently ingest, transform, and load data from diverse sources into data warehouses, data lakes, and Snowflake will also be part of your role.

Furthermore, you will ensure data integrity and security by adhering to best practices and data governance policies. Monitoring and rectifying data pipelines for timely and accurate data delivery, and optimizing data storage and retrieval processes for performance and scalability, will be among your key responsibilities. Staying abreast of industry trends and best practices in data engineering and cloud technologies is essential, along with mentoring and guiding junior data engineers.

To qualify for this position, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, and have over 5 years of experience in data engineering with a strong emphasis on Azure, ADF, Azure Fabric, Databricks, and Snowflake. Proficiency in SQL, experience in data modeling and database design, and solid programming skills in Python, Scala, or Java are prerequisites. Familiarity with big data technologies like Apache Spark, Hadoop, and Kafka, as well as a sound grasp of data warehousing concepts and solutions, including Azure Synapse Analytics and Snowflake, is highly desirable. Knowledge of data governance, data quality, and data security best practices, exceptional problem-solving skills, and effective communication and collaboration abilities within a team setting are essential.

Preferred qualifications include experience with other Azure services such as Azure Blob Storage, Azure SQL Database, and Azure Cosmos DB, familiarity with DevOps practices and tools for CI/CD in data engineering, and certifications in Azure Data Engineering, Snowflake, or related areas.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Marketing Data Analyst, your primary responsibility will be to analyze data from various marketing channels to evaluate performance, identify trends, and develop insights. You will create detailed reports and dashboards to monitor goals and segment audiences for tailored marketing strategies. Your role will also involve optimizing campaign performance, designing testing strategies to enhance channel effectiveness, and providing business intelligence to support customer lifecycle management and go-to-market strategies. Additionally, you will collaborate with Sales, Marketing, and Strategy teams on various programs and deliver clear, data-driven recommendations to enhance overall marketing efforts.

Your responsibilities will include analyzing marketing data and supporting campaign targeting by collecting, interpreting, and analyzing data from different marketing channels to identify trends and actionable insights. You will work closely with Sales, Marketing, Strategy, and Customer Success teams to understand their data needs and provide analytical solutions. Evaluating marketing campaign performance, designing testing strategies, and offering recommendations for improvement will also be part of your role. You will perform data quality assurance by validating and cleaning data to ensure its accuracy and reliability for reporting and analysis, and conduct ad-hoc analysis to solve business problems, monitor performance trends, and provide actionable insights. Furthermore, you will integrate data from external sources and apply fuzzy-matching techniques to compare and merge data for a comprehensive view of customers and prospects. You will report to the Marketing Analytics Manager in India and collaborate with the US team.

To be successful in this role, you should have at least 3 years of experience in data analysis, use SQL, Snowflake, and Python daily, and be able to produce relevant insights while collaborating effectively with cross-functional teams.

Avalara offers a comprehensive Total Rewards package, including competitive compensation, paid time off, paid parental leave, and bonus eligibility. Health and wellness benefits, such as private medical, life, and disability insurance, are also provided. Avalara strongly supports diversity, equity, and inclusion, integrating these values into its business practices and organizational culture; the company has eight employee-run resource groups, each with senior leadership and executive sponsorship.

Avalara is a dynamic company at the intersection of tax and technology, with a growing, industry-leading cloud compliance platform. The company values innovation, disruption, and inclusivity, and is committed to empowering its employees to succeed. If you are looking for a challenging and rewarding career in a vibrant and supportive environment, consider joining Avalara, where your individuality and achievements are celebrated. Avalara is an Equal Opportunity Employer.
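A hedged sketch of the channel-performance analysis described above, using pandas; the CSV export, column names, and cost-per-acquisition metric are assumptions chosen for illustration.

```python
import pandas as pd

# Hypothetical export of per-campaign metrics; columns assumed: channel, spend, conversions.
campaigns = pd.read_csv("campaign_metrics.csv")

by_channel = campaigns.groupby("channel").agg(
    spend=("spend", "sum"),
    conversions=("conversions", "sum"),
)
by_channel["cost_per_acquisition"] = by_channel["spend"] / by_channel["conversions"]

# Channels with the lowest acquisition cost first, a typical starting point for optimization.
print(by_channel.sort_values("cost_per_acquisition"))
```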
Posted 3 days ago
8.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
You will be joining a leading AI-driven global supply chain solutions software product company, recognized as one of Glassdoor's Best Places to Work. As a Blue Yonder Sr Technical Architect in the Professional Services team, your primary responsibility will be overseeing the services delivery of product implementations and related integrations. You will work closely with a team of 80+ associates in India, with the team expected to grow rapidly. Your role will involve architecting, designing, and leading the technical team on various projects, collaborating with counterparts, customers, and architects.

In this role, you will lead a team of technical consultants and developers on solution implementation, enhancement, managed services, integration, and custom projects, either at customer sites or offshore. Your tasks will include estimating workload, conducting technical analysis sessions, designing and developing platform workflows, providing technical support to the project team, and handling activities with a diverse scope. Additionally, you will provide internal and external training, contribute to internal development projects, and identify revenue opportunities within existing customer projects.

The ideal candidate will have a Bachelor's degree in Computer Science and 8 to 14 years of experience in the software industry, preferably starting as a consultant and progressing to leading technical teams through multiple implementation cycles. Strong technical proficiency in programming, problem-solving, performance tuning, PL/SQL, databases, ETL, platform work, Perl scripting, and enterprise solutions deployment is required. You should have expertise in the Blue Yonder SCPO platform, cloud technologies, architectural design, web technologies, and API fundamentals. Hands-on development skills are essential, along with a passion for learning new technologies and products.

Your role will involve continuous learning and upskilling on Blue Yonder's product suite, traveling to customer sites as needed, and working across overlapping time zones. You will be expected to create artifacts for reusability and automation, helping to shorten implementation lifecycles and reduce errors after go-live.

If you align with our company values and are committed to fostering an inclusive environment, we invite you to explore the opportunity to join our team and drive both our success and the success of our customers.
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Data Visualization professional with a minimum of 6 years of hands-on experience in the Financial Services (Asset Management, buy/sell side) domain, you will leverage your expertise in CRMA (CRM Analytics), Tableau CRM Analytics, and Salesforce Tableau CRM Analytics. Your role will involve creating visually appealing and insightful data visualizations to support the organization's decision-making processes. Proficiency in Snowflake and SQL as secondary skills will be advantageous in this role.

The position offers a hybrid work mode in Mumbai, with a notice period of 1 week to 15 days. If you are a detail-oriented individual with a passion for transforming complex data into actionable insights, this opportunity aligns with your skill set and experience level.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
kochi, kerala
On-site
As a Senior Software Engineer at our company, you will be a key member of our dynamic team, bringing extensive expertise in the Microsoft technology stack. Your primary focus will be on data engineering, microservices, event-driven architecture, and Snowflake as you design, develop, and implement scalable software solutions that align with our business objectives.

Your responsibilities will include designing and maintaining robust software applications using the Microsoft tech stack, implementing data engineering solutions with a specific emphasis on Snowflake for data integration and storage, and developing microservices architectures that enhance the scalability, flexibility, and maintainability of our applications. Leveraging event-driven architecture will be crucial to ensuring system responsiveness and real-time data processing, while collaborating with cross-functional teams in an agile environment will be essential to delivering effective solutions. Additionally, you will conduct code reviews, mentor junior developers, and uphold best practices in software development. Troubleshooting and resolving complex technical issues, staying abreast of emerging technologies and industry trends, and continuously driving improvement will also be part of your responsibilities.

To succeed in this role, you must hold a Bachelor's degree in Computer Science, Software Engineering, or a related field, and demonstrate proven experience as a Software Engineer with a strong focus on the Microsoft technology stack, including .NET, C#, and Azure. Expertise in data engineering concepts, particularly with Snowflake, and experience developing microservices and using containerization technologies like Docker and Kubernetes are critical. Familiarity with event-driven architecture and messaging systems such as Azure Service Bus and Kafka, with SQL and database management, and with software development methodologies like Agile and Scrum is also essential.

Your problem-solving and analytical skills, along with your proficiency in communication and collaboration, will be valuable assets in this role. Experience with CI/CD pipelines, DevOps practices, cloud services (specifically Azure), and enterprise application integration will further enhance your capabilities. Familiarity with front-end technologies like JavaScript, Angular, and React will be advantageous.

Desired skills include experience in the financial services or asset management industries, as well as certifications in Azure or relevant data engineering technologies. Your expertise in Azure, C#/.NET, Snowflake, and containerization technologies will be instrumental in driving the success of our software solutions.
Posted 3 days ago