3.0 - 5.0 years
0 Lacs
hyderabad, telangana, india
On-site
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum Experience Required: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data profiling and data cleansing methodologies.
- Familiarity with database management systems and SQL.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chandigarh
On-site
As a Senior Data Engineer at Emerson, you will be a key member of the Global BI team supporting the migration to Microsoft Fabric. Your primary responsibility will be to focus on data gathering, modeling, integration, and database design to ensure efficient data management. By developing and optimizing scalable data models, you will play a crucial role in supporting analytics and reporting needs, utilizing Microsoft Fabric and Azure technologies for high-performance data processing. Your main responsibilities will include collaborating with cross-functional teams such as data analysts, data scientists, and business collaborators to understand data requirements and deliver effective solutions. You will leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives. Your expertise in data modeling, with a specific focus on data warehouse and lakehouse design, will be instrumental in designing and implementing data models, warehouses, and databases using various Azure services. In addition to data modeling, you will be responsible for developing ETL processes using tools like SQL Server Integration Services (SSIS) and Azure Synapse Pipelines to prepare data for analysis and reporting. Implementing data quality checks and governance practices to ensure data accuracy, consistency, and security will also be a key aspect of your role. You will supervise and optimize data pipelines and workflows using Microsoft Fabric for real-time analytics and AI-powered workloads. Your proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, and other analytics platforms will be essential, along with experience in data integration and ETL tools like Azure Data Factory. Your in-depth knowledge of the Azure Cloud Platform, particularly in data warehousing and storage solutions, will be crucial for success in this role. 
Strong communication skills to convey technical concepts to both technical and non-technical stakeholders, as well as the ability to work independently and within a team environment, will also be required. To excel in this role, you will need 5-7 years of experience in Data Warehousing with on-premises or cloud technologies. Strong analytical abilities, proficiency in database management, SQL query optimization, and data mapping are essential skills. Proficiency in Excel, Python, SQL/Advanced SQL, and hands-on experience with Fabric components will be beneficial. The willingness to work flexible hours based on project requirements, strong documentation skills, and the ability to handle sensitive information with discretion are also important attributes for this role. Preferred qualifications include a Bachelor's degree or equivalent experience in Science, with a focus on MIS, Computer Science, Engineering, or related areas. Good interpersonal skills in English, agile certification, and experience with Oracle, SAP, or other ERP systems are preferred qualifications that set you apart. The ability to quickly learn new business areas, software, and emerging technologies, as well as the willingness to travel up to 20% as needed, will also be valuable in this role. At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives to drive growth and deliver business results. Our commitment to ongoing career development and an inclusive culture ensures you have the support to thrive and make a lasting impact. We offer competitive benefits plans, medical insurance, flexible time off, and opportunities for mentorship, training, and leadership development. Join Emerson and be part of a global leader in automation technology and software that helps customers in critical industries operate more sustainably and efficiently. 
We offer equitable opportunities, celebrate diversity, and embrace challenges to make a positive impact across various countries and industries. If you are looking to make a difference and contribute to innovative solutions, Emerson is the place for you. Let's go, together.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
At 3Pillar, the focus is on leveraging cutting-edge technologies to revolutionize industries by enabling data-driven decision-making. As a Data Engineer, you will play a crucial role within the dynamic team, actively contributing to exciting projects that redefine data analytics for clients, providing them with a competitive edge in their respective industries. If you are passionate about data analytics solutions that have a real-world impact, this position opens the door to the captivating world of Data Science and Engineering! You will be based at the office location in Vikhroli/Churchgate, Mumbai.

Responsibilities
- Design, develop, and maintain data pipelines and workflows using PySpark to efficiently process large volumes of data.
- Implement data ingestion, transformation, and storage solutions to meet business requirements.
- Develop and optimize Python scripts and algorithms for data manipulation, analysis, and modeling tasks.
- Design and manage database systems, including schema design, indexing, and performance optimization.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that align with business needs.
- Perform data quality assessment, validation, and cleansing to ensure data accuracy and integrity.
- Monitor and troubleshoot data pipeline issues, performance bottlenecks, and system failures.
- Stay updated with emerging technologies and best practices in data engineering and analytics.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum of 3 years of experience in Database or Data Analytics.
- At least 3 years of experience in data engineering, with expertise in Spark, Python, and database technologies.
- Experience with PySpark for data processing and analysis tasks.
- Proficiency in Python programming for scripting and automation.
- Strong understanding of database concepts and hands-on experience with SQL and NoSQL databases.
- Experience with data modeling, schema design, and ETL processes.
- Strong problem-solving skills and the ability to work independently and collaboratively in a fast-paced environment.
- Excellent communication and interpersonal skills, with the ability to effectively convey technical concepts to non-technical stakeholders.
- Experience with cloud platforms would be an advantage.

If you meet the qualifications and are excited about the opportunity to work on impactful data analytics projects with a focus on innovation, apply for this job.
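The data quality assessment, validation, and cleansing step described above can be sketched in plain Python for brevity (the same filter logic maps onto PySpark DataFrame operations); the record fields and rules here are invented for illustration:

```python
# Minimal sketch of a validate-and-cleanse pipeline stage.
# Field names and rules are illustrative, not from any real schema.

def cleanse(records, required=("id", "amount")):
    """Split records into valid rows and rejects, with a reason per reject."""
    valid, rejects = [], []
    for rec in records:
        missing = [f for f in required if rec.get(f) in (None, "")]
        if missing:
            rejects.append((rec, f"missing fields: {missing}"))
        elif rec["amount"] < 0:
            rejects.append((rec, "negative amount"))
        else:
            valid.append(rec)
    return valid, rejects

rows = [
    {"id": 1, "amount": 250.0},
    {"id": 2, "amount": -10.0},    # fails the non-negative rule
    {"id": None, "amount": 99.0},  # fails the required-field rule
]
good, bad = cleanse(rows)
```

Keeping rejects with a reason, rather than silently dropping rows, is what makes downstream data quality reporting possible.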
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You are a highly skilled Automation and UI Specialist responsible for enhancing data-driven processes and web development capabilities. Your deep understanding of Google Sheets, Google Apps Script, JavaScript, and web development technologies will be crucial in automating workflows, creating insightful visual reports, and contributing to web application development to improve operational efficiency across teams. As an Automation and UI Specialist, your key responsibilities include automating data workflows and tasks using Google Apps Script and JavaScript to reduce manual intervention and increase efficiency. You will also contribute to the development of web applications using HTML, CSS, JavaScript, and Java. Your role involves creating visually appealing and user-friendly interfaces using HTML and CSS, collaborating with the team using Git for version control and code management, identifying and troubleshooting technical issues in a timely manner, effectively communicating with team members, stakeholders, and clients, and staying updated with the latest industry trends and technologies. To excel in this role, you must have a Bachelor's degree in a relevant field such as Computer Science, Information Systems, Mathematics, etc., and possess 3-5 years of experience. Proficiency in operating systems like Linux OS and basic Linux commands, a strong understanding of Apps Script/JavaScript, experience with HTML and CSS for UI development, and familiarity with Git for collaborative development are must-have technical skills. Additionally, experience with ETL processes and data integration, knowledge of JavaScript frameworks or libraries (e.g., Node.js, React), and an understanding of data governance, security, and quality assurance best practices are good-to-have skills.
Soft skills such as strong communication skills, excellent problem-solving abilities, the capacity to manage multiple projects simultaneously, the ability to work effectively in a fast-paced, team-oriented environment, and a proactive attitude toward learning new technologies and staying updated on industry trends will be essential for success in this role.
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
mumbai, maharashtra, india
On-site
India is among the top ten priority markets for General Mills, and hosts our Global Shared Services Centre. This is the Global Shared Services arm of General Mills Inc., which supports its operations worldwide. With over 1,300 employees in Mumbai, the center has capabilities in the areas of Supply Chain, Finance, HR, Digital and Technology, Sales Capabilities, Consumer Insights, ITQ (R&D & Quality), and Enterprise Business Services. Learning and capacity-building is a key ingredient of our success. Business Intelligence-II About General Mills We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out General Mills India Center is our global capability center in Mumbai that works as an extension of our global organization delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of supply chain, digital & technology, innovation, technology & quality, consumer & market intelligence, sales strategy & intelligence, global shared services, finance shared services and Human Resources Shared Services. For more details check out Job Overview This position is a part of the Consumer and Market Insights, Performance and Consumer Analytics team.
This is a global team supporting all our global businesses and enabling strategies through data-driven solutions. The team is based out of Minneapolis, Minnesota and Mumbai, India. We are looking for an experienced Tableau developer who also has a working knowledge of SQL, necessary for data extraction, data validation, and creating basic data tables at the visual backend to maintain the data feeds for the visualization. This person will work with end business users and our data science/analytics team to make our data more easily usable. You will work with stakeholders throughout the world. As part of this team, you will be expected to develop self-service dashboards that give business users quick and easy access to different sources of data. This person will also work with teams from other functions and end clients in other countries, mostly in the US but also in other parts of the world.

Key Accountabilities
- Design, develop, and maintain Tableau dashboards, such as the Brand Health Tracker (BHT), and other reports that effectively communicate business trends and KPIs.
- Work closely with business users and stakeholders to gather requirements and understand reporting needs.
- Translate complex data sets into clear and actionable visualizations.
- Optimize dashboards for performance and usability.
- Integrate Tableau dashboards with various data sources (SQL databases, cloud services, spreadsheets, etc.).
- Conduct testing and validation to ensure accuracy and reliability of dashboards.
- Understand the database behind the reports in order to give database developers the requirements for the data and structure needed in the Production environment to build performant visualizations.
- Leverage advanced MS Excel functionality for data manipulation and validation work.
- Document dashboard designs, data flows, and business logic.
- Provide training and support to end-users on dashboard use and interpretation.

Minimum Qualifications
- Bachelor's degree from a government-recognized institution with certification in Engineering or Computer Application (BE/B.Tech/BCA)
- Four to six years of experience in Tableau Desktop, with an understanding of Agile/Scrum environments or familiarity with tools like Jira, Confluence, or Trello
- Excellent communication and presentation skills, with attention to detail and a strong sense of design and layout in dashboards
- Experience with Tableau Server and/or Tableau Online for publishing and sharing dashboards
- Proficiency in MS Excel, PowerPoint, and SQL queries to improve load times and efficiency
- Understanding of data modeling, data warehousing, and ETL processes
- Ability to work with large and complex data sets; experience creating user guides and training material, and conducting sessions for business users on dashboard usage

Preferred Qualifications
- Master's degree from a government-recognized institution with certification in Engineering or Computer Application (ME/M.Tech/MCA)
- Advanced MS Office packages and VBA macro-based report automation preferred
- Knowledge of, or hands-on experience with, other BI platforms like Power BI or Looker is an added advantage
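The "basic data tables at the visual backend" mentioned above, pre-aggregated feeds a dashboard reads from, can be sketched with SQLite standing in for the production database; the table, columns, and metric are invented for illustration:

```python
import sqlite3

# Build a small aggregate table of the kind a dashboard data source reads from.
# Schema and metric names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE brand_tracker (brand TEXT, quarter TEXT, awareness REAL);
    INSERT INTO brand_tracker VALUES
        ('A', '2024Q1', 0.41), ('A', '2024Q2', 0.45),
        ('B', '2024Q1', 0.30), ('B', '2024Q2', 0.28);
    -- Pre-aggregated feed: one row per brand, ready for the visual layer.
    CREATE TABLE dashboard_feed AS
        SELECT brand, AVG(awareness) AS avg_awareness
        FROM brand_tracker GROUP BY brand;
""")
feed = dict(conn.execute("SELECT brand, avg_awareness FROM dashboard_feed"))
```

Pushing the aggregation into a backend table like this is one common way to keep the visualization itself fast, since the BI tool then only scans the small feed table.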
Posted 2 weeks ago
6.0 - 8.0 years
0 Lacs
hyderabad, telangana, india
On-site
About this role: Wells Fargo is seeking an Information Security Engineering Senior Manager. We believe in the power of working together because great ideas can come from anyone. Through collaboration, any employee can have an impact and make a difference for the entire company. Explore opportunities with us for a career in a supportive environment where you can learn and grow.

In this role, you will:
- Manage a team, through less experienced managers, of Information Security Engineers who design, document, test, maintain, and provide issue resolution recommendations for highly complex security solutions related to networking, cryptography, cloud, authentication or directory services, email, internet, applications, or endpoint security
- Engage more experienced information security and line of business management to identify, formulate, and implement information security solutions and controls
- Lead a large, complex information security unit, or a number of smaller specialized work units, with direct impact on companywide information security objectives of high risk and complexity
- Manage security consulting on large projects for internal clients to ensure conformity with corporate information security policy and standards
- Set guidelines for compliance and risk management requirements for the supported area and work with other stakeholders to implement key risk initiatives
- Oversee resource allocations to ensure commitments align with strategic objectives
- Manage implementation of information security practices such as availability, integrity, confidentiality, risk management, threat identification, modeling, monitoring, incident response, access management, and business continuity
- Maintain a broad awareness of the state of information security across the enterprise and industry
- Influence change to information security policy, standards, and procedures for systems, applications, or tools
- Lead large, companywide projects and initiatives
- Represent the organization to regulators, industry groups, and governmental agencies
- Interface with information security industry leaders, financial industry leaders, analysts, and regulators
- Advise more experienced leadership or executive management on issues with high, critical impact on the company
- Manage allocation of people and financial resources for Information Security Architecture
- Develop and guide a culture of talent development to meet business objectives and strategy

Required Qualifications:
- 6+ years of Information Security Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- 3+ years of management or leadership experience

Desired Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering and in a management or leadership role, leading data engineering or BI teams
- Strong understanding of data warehousing concepts, ETL processes, and data modeling techniques
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP)
- Proficiency in SQL and at least one programming language (e.g., Python, Java)
- Experience with BI tools (e.g., Tableau, Power BI)
- Excellent communication, interpersonal, and leadership skills
- Proven ability to build and manage high-performing teams
- Strong problem-solving and analytical skills
- Experience with Agile development methodologies

- Lead, mentor, and grow a team of data engineers, BI developers, and potentially data analysts
- Foster a culture of collaboration, innovation, and continuous improvement
- Conduct performance reviews, provide coaching and feedback, and support career development
- Recruit, interview, and hire top talent to build a high-performing team
- Manage team workload, prioritize projects, and ensure timely delivery
- Define and implement best practices for data warehousing, ETL processes, data quality, and BI reporting
- Stay up-to-date with the latest trends and technologies in data engineering and BI
- Contribute to the overall data strategy for the company
- Develop and execute the data engineering and BI roadmap, aligning with overall business objectives
- Oversee the design, development, and maintenance of data pipelines, data warehouses, and BI dashboards
- Ensure the scalability, reliability, and performance of data infrastructure
- Collaborate with product managers and business stakeholders to define requirements and deliver data solutions that meet their needs
- Manage projects effectively, ensuring on-time and within-budget delivery
- Implement and maintain data governance policies and procedures
- Partner closely with product managers, data scientists, and business stakeholders to understand their data needs
- Communicate effectively with both technical and non-technical audiences
- Present data insights and recommendations to senior management
- Build strong relationships with other engineering teams

Posting End Date: 28 Aug 2025

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions.
There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
Posted 2 weeks ago
8.0 - 12.0 years
25 - 30 Lacs
mumbai
Work from Office
Job Title: Informatica CDQ Lead (Contract, Work From Office, Mumbai)

We have an immediate requirement for an experienced Informatica CDQ Lead for a contract position in Mumbai. The ideal candidate should have expertise in Informatica Cloud Data Quality (CDQ) and strong data management skills.

Key Responsibilities:
- Lead Informatica CDQ implementations for multiple projects.
- Develop and maintain data quality rules, scorecards, and dashboards.
- Collaborate with data architects and stewards to define solutions.
- Identify and resolve data quality issues to ensure accuracy.
- Work with business teams to understand requirements and provide solutions.
- Conduct root cause analysis and recommend improvements.
- Provide support and guidance to team members.
- Monitor and manage data quality performance with reports.

Required Skills and Qualifications:
- 6+ years of experience with Informatica CDQ.
- Expertise in data quality solutions and governance.
- Strong knowledge of ETL processes and Informatica CDQ rules.
- Experience in leading data quality initiatives and teams.
- Familiarity with AWS, Azure, or Google Cloud is a plus.
- Excellent problem-solving and leadership skills.
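The data quality rules and scorecards described above are Informatica CDQ concepts, but the idea behind them, per-rule pass rates rolled up against a threshold, can be sketched in plain Python; the rules and the 90% threshold are hypothetical:

```python
# Sketch of data-quality rules rolled up into a scorecard pass rate.
# Rule definitions and the 90% threshold are illustrative assumptions.

RULES = {
    "email_present": lambda r: bool(r.get("email")),
    "age_in_range": lambda r: 0 <= r.get("age", -1) <= 120,
}

def scorecard(records):
    """Return each rule's pass rate as a fraction of records."""
    return {
        name: sum(rule(r) for r in records) / len(records)
        for name, rule in RULES.items()
    }

data = [
    {"email": "a@x.com", "age": 34},
    {"email": "", "age": 29},        # fails email_present
    {"email": "b@x.com", "age": 150},  # fails age_in_range
    {"email": "c@x.com", "age": 41},
]
scores = scorecard(data)
failing = [name for name, rate in scores.items() if rate < 0.90]
```

In a CDQ implementation the rules are authored in the tool and the scorecard is a built-in artifact; the sketch only shows the shape of the computation.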
Posted 2 weeks ago
1.0 - 5.0 years
7 - 10 Lacs
kolkata
Work from Office
Job Title: SSIS Developer
Number of Positions: 5
Experience: 4-5 Years
Location: Remote (Preferred: Ahmedabad, Gurgaon, Mumbai, Pune, Bangalore)
Shift Timing: Evening/Night (Start time: 6:30 PM IST onwards)

Job Summary
We are seeking skilled SSIS Developers with 4-5 years of experience in developing and maintaining data integration solutions. The ideal candidate will have strong expertise in SSIS and SQL, a solid understanding of data warehousing concepts, and exposure to Azure data services. This role requires clear communication and the ability to work independently during evening or night hours.

Key Responsibilities
- Design, develop, and maintain SSIS packages for ETL processes.
- Write and optimize complex SQL queries and stored procedures.
- Ensure data accuracy, integrity, and performance across DWH systems.
- Collaborate with team members to gather and understand requirements.
- Work with Azure-based data platforms and services as needed.
- Troubleshoot and resolve data integration issues promptly.
- Document technical specifications and maintain version control.

Required Skills
- Proficiency in Microsoft SSIS (SQL Server Integration Services).
- Strong SQL skills, including performance tuning and debugging.
- Good understanding of data warehousing concepts and ETL best practices.
- Exposure to Azure (e.g., Data Factory, SQL Database, Blob Storage).
- Strong communication and collaboration skills.
- Ability to work independently during US-aligned hours.

Preferred Qualifications
- Experience working in a remote, distributed team environment.
- Familiarity with agile methodologies and tools like JIRA and Git.
Posted 2 weeks ago
6.0 - 10.0 years
30 - 35 Lacs
bengaluru
Work from Office
We are seeking an experienced Amazon Redshift Developer / Data Engineer to design, develop, and optimize cloud-based data warehousing solutions. The ideal candidate should have expertise in Amazon Redshift, ETL processes, SQL optimization, and cloud-based data lake architectures. This role involves working with large-scale datasets, performance tuning, and building scalable data pipelines.

Key Responsibilities:
- Design, develop, and maintain data models, schemas, and stored procedures in Amazon Redshift.
- Optimize Redshift performance using distribution styles, sort keys, and compression techniques.
- Build and maintain ETL/ELT data pipelines using AWS Glue, AWS Lambda, Apache Airflow, and dbt.
- Develop complex SQL queries, stored procedures, and materialized views for data transformations.
- Integrate Redshift with AWS services such as S3, Athena, Glue, Kinesis, and DynamoDB.
- Implement data partitioning, clustering, and query tuning strategies for optimal performance.
- Ensure data security, governance, and compliance (GDPR, HIPAA, CCPA, etc.).
- Work with data scientists and analysts to support BI tools like QuickSight, Tableau, and Power BI.
- Monitor Redshift clusters, troubleshoot performance issues, and implement cost-saving strategies.
- Automate data ingestion, transformations, and warehouse maintenance tasks.

Required Skills & Qualifications:
- 6+ years of experience in data warehousing, ETL, and data engineering.
- Strong hands-on experience with Amazon Redshift and AWS data services.
- Expertise in SQL performance tuning, indexing, and query optimization.
- Experience with ETL/ELT tools like AWS Glue, Apache Airflow, dbt, or Talend.
- Knowledge of big data processing frameworks (Spark, EMR, Presto, Athena).
- Familiarity with data lake architectures and the modern data stack.
- Proficiency in Python, shell scripting, or PySpark for automation.
- Experience working in Agile/DevOps environments with CI/CD pipelines.
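The distribution styles and sort keys mentioned above are declared in the table DDL, via Redshift's `DISTSTYLE`, `DISTKEY`, and `SORTKEY` clauses. A small helper that emits such DDL can be sketched as below; the table and column names are hypothetical:

```python
# Sketch: build Redshift CREATE TABLE DDL with distribution and sort keys.
# The schema here is illustrative only.

def redshift_ddl(table, columns, distkey=None, sortkeys=()):
    """Return a CREATE TABLE statement using Redshift's DISTSTYLE/SORTKEY syntax."""
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    ddl = f"CREATE TABLE {table} ({cols})"
    ddl += f" DISTSTYLE KEY DISTKEY({distkey})" if distkey else " DISTSTYLE EVEN"
    if sortkeys:
        ddl += f" SORTKEY({', '.join(sortkeys)})"
    return ddl + ";"

ddl = redshift_ddl(
    "sales_fact",
    [("sale_id", "BIGINT"), ("customer_id", "BIGINT"), ("sold_at", "TIMESTAMP")],
    distkey="customer_id",  # co-locate rows that join on customer_id
    sortkeys=("sold_at",),  # enable range-restricted scans on sold_at
)
```

The design choice the clauses encode: a `DISTKEY` on the most common join column avoids cross-node data shuffles, while a `SORTKEY` on the filter column lets Redshift skip blocks during scans.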
Posted 2 weeks ago
3.0 - 4.0 years
5 - 9 Lacs
ahmedabad
Work from Office
Roles and Responsibilities:
- Design and implement efficient data models in Power BI to support business requirements.
- Ensure data models are scalable, optimized, and maintainable.
- Develop and optimize DAX formulas for creating calculated columns, measures, and calculated tables.
- Implement complex calculations and business logic in DAX.
- Design and implement ETL processes using Power Query to transform and load data into Power BI.
- Ensure data quality and integrity through effective data cleansing techniques.
- Integrate Power BI with Power Apps and Power Automate to create end-to-end solutions.
- Develop custom applications and automate business processes.
- Design and manage data flows to streamline data integration and transformation.
- Optimize data flow processes for improved performance.
- Write and optimize SQL queries for data extraction and manipulation.
- Collaborate with the database team to ensure efficient data retrieval.
- Create and maintain comprehensive documentation for Power BI solutions, data models, and processes; ensure documentation is up-to-date and accessible to the team.
- Collaborate with cross-functional teams to understand business requirements and deliver effective BI solutions.
- Provide guidance and support to junior team members.
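The DAX measures mentioned above run inside Power BI, but the arithmetic a typical time-intelligence measure expresses, here a running total within a year, can be sketched in plain Python (the column names and data are invented for illustration):

```python
from itertools import accumulate

# Sketch of what a running-total (year-to-date style) measure computes:
# cumulative sales within a year, ordered by month. Data is illustrative.
rows = [
    {"month": "2024-02", "sales": 150},
    {"month": "2024-01", "sales": 100},
    {"month": "2024-03", "sales": 120},
]
rows.sort(key=lambda r: r["month"])  # the measure is order-dependent
ytd = list(accumulate(r["sales"] for r in rows))
# Roughly the idea behind DAX time-intelligence functions such as TOTALYTD.
```

In Power BI the same logic is one measure definition evaluated per filter context; the sketch just makes the underlying cumulative sum explicit.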
Posted 2 weeks ago
5.0 - 10.0 years
11 - 16 Lacs
pune
Work from Office
Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.
Posted 2 weeks ago
4.0 - 5.0 years
12 - 14 Lacs
mumbai, new delhi, bengaluru
Work from Office
- ETL Process Design: Designing and developing ETL processes using Talend for data integration and transformation.
- Data Extraction: Extracting data from various sources, including databases, APIs, and flat files.
- Data Transformation: Transforming data to meet business requirements and ensuring data quality.
- Data Loading: Loading transformed data into target systems, such as data warehouses or data lakes.
- Job Scheduling: Scheduling and automating ETL jobs using Talend's scheduling tools.
- Performance Optimization: Optimizing ETL workflows for efficiency and performance.
- Error Handling: Implementing robust error handling and logging mechanisms in ETL processes.
- Data Profiling: Performing data profiling to identify data quality issues and inconsistencies.
- Documentation: Documenting ETL processes, data flow diagrams, and technical specifications.
- Collaboration with Data Teams: Working closely with data analysts, data scientists, and other stakeholders to understand data requirements.

Minimum 4 to maximum 7 years of relevant experience.
Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
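The extract-transform-load flow with error handling described above can be sketched in plain Python (Talend jobs are built visually and generate Java, but the reject-flow pattern is the same; all field names are hypothetical):

```python
# Sketch of an ETL job with a reject flow: bad rows are captured with a
# reason instead of aborting the whole load. All names are illustrative.

def extract():
    """Stand-in source; a real job would read a database, API, or flat file."""
    yield from [{"id": "1", "price": "19.99"},
                {"id": "2", "price": "oops"},   # will land in the reject flow
                {"id": "3", "price": "5.50"}]

def transform(row):
    """Type the raw strings; raises on malformed input."""
    return {"id": int(row["id"]), "price": float(row["price"])}

loaded, rejects = [], []
for row in extract():
    try:
        loaded.append(transform(row))
    except (ValueError, KeyError) as exc:
        rejects.append({"row": row, "error": str(exc)})
```

Routing failures to a reject output with the original row and the error message mirrors how Talend components expose a "reject" link alongside the main flow.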
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
india
On-site
DESCRIPTION

At Amazon, our goal is to be Earth's most customer-centric company and to create a safe environment for both our customers and our associates. To achieve that, we need exceptionally talented, bright, dynamic, and driven people. If you'd like to help us build the place to find and buy anything online, this is your chance to make history. We are looking for a talented Business Intelligence Engineer to join the Trustworthy Shopping Experience (TSE) Operations Analytics Team.

Key job responsibilities
- Design and implement scalable data architecture and analytics pipelines using AWS services, creating robust ETL processes and optimized data models to support enterprise reporting needs
- Build automated, self-service reporting capabilities and dashboards using BI tools like QuickSight, enabling stakeholders to independently access insights and address business questions
- Partner with cross-functional teams (BAs, Data Engineers, Product Managers) to gather requirements, translate business needs into technical solutions, and deliver high-quality data products
- Develop and optimize complex SQL queries and stored procedures while implementing data quality frameworks to ensure accuracy and reliability of analytics solutions
- Conduct advanced statistical analysis and create analytical models to uncover trends, develop insights, and support data-driven decision making
- Identify and implement process improvements through automation, code optimization, and integration of generative AI capabilities to enhance BI processes and reporting efficiency
- Lead technical discussions, provide mentorship to junior team members, and present solutions to stakeholders while staying current with emerging technologies and best practices

A day in the life
As a Business Intelligence Engineer in TSE Operations Analytics, you'll develop analytical solutions to provide insights into support operations and measure global technical support initiatives. You'll transform operational data into actionable insights for TSE leaders and engineers worldwide, managing critical datasets for support metrics, productivity, and customer satisfaction. Key responsibilities include building real-time dashboards, automated reporting systems, and data quality frameworks. You'll work with stakeholders to implement analytics solutions and drive data-driven decision-making. The role involves optimizing query performance and translating findings into recommendations to improve support efficiency and customer experience.

About the team
We are a dynamic analytics team within TSE (Trustworthy Shopping Experience) Operations, combining Business Intelligence Engineers, Data Engineers, and Business Analysts. Our mission is twofold: protecting customers from unsafe or non-compliant products while enabling sellers to grow their businesses confidently on Amazon. We build scalable BI solutions and data-driven insights that streamline compliance processes, improve operational efficiency, and enhance the seller experience. Our team focuses on delivering analytical solutions that drive operational excellence, uncover emerging risks, reduce investigation errors, and optimize the customer-seller trust ecosystem. Through advanced analytics and BI tools, we help shape the future of Amazon's trustworthy shopping experience.

BASIC QUALIFICATIONS
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS, and Matlab
- Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

PREFERRED QUALIFICATIONS
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
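For illustration, the SQL-plus-Python workflow named in the qualifications (pull data with SQL, then process it in a script for modeling) can be sketched as follows. This is a minimal sketch only: the `support_tickets` table and its columns are hypothetical, and an in-memory SQLite database stands in for a warehouse such as Redshift.

```python
import sqlite3

# In-memory SQLite stands in for a warehouse such as Redshift (assumption);
# table and column names here are hypothetical, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE support_tickets (ticket_id INTEGER, region TEXT, resolution_hours REAL);
    INSERT INTO support_tickets VALUES
        (1, 'IN', 4.0), (2, 'IN', 6.0), (3, 'US', 3.0);
""")

# Pull aggregated data with SQL, then post-process it in Python.
rows = conn.execute("""
    SELECT region, AVG(resolution_hours) AS avg_hours
    FROM support_tickets
    GROUP BY region
    ORDER BY region
""").fetchall()

metrics = {region: avg for region, avg in rows}
print(metrics)  # {'IN': 5.0, 'US': 3.0}
```

The same shape (SQL for heavy lifting, Python for the last mile) is what the "scripting experience (Python) to process data" line usually means in practice.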
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Senior Data Engineer at Attentive.ai, you will play a pivotal role in enabling data accessibility, migration, and intelligence across different products and teams. You will be responsible for building the entire architecture and processes for streamlining data migration for our Accelerate product, and you will be involved in defining the vision for collecting internal data for various business, product, and engineering use cases. We expect you to be well balanced: super smart, quality and delivery focused, with high-level business acumen. You should be able to see nuances and intricacies that others might miss.

Roles & Responsibilities:
- Design, develop, and implement solutions to migrate customers' existing data to Accelerate.
- Work closely with Accelerate engineering teams to gain an understanding of the internal schema.
- Define and adopt engineering best practices.
- Collaborate with cross-functional teams (customer success, sales, and product) to understand data requirements and ensure seamless data integration.
- Set and execute the data vision strategy for internal data movement, ensuring data consistency, quality, and security.
- Optimize and maintain data pipelines, ensuring high performance and scalability.
- Monitor and troubleshoot data issues, providing timely resolutions.
- Mentor junior data analysts and contribute to the growth of the data engineering team.

Skills & Requirements:
- Minimum 5+ years of experience in building data platforms, data engineering solutions, and data architecture.
- Proven experience in designing and building data migration platforms, including planning, execution, and validation of data migrations.
- Proficiency in SQL and experience with data modeling, ETL processes, and data warehousing solutions.
- Knowledge of popular data migration tools, ETL technologies, and frameworks (Airflow, Apache Beam, KNIME, etc.).
- Strong programming skills in Python, Java, or Scala.
- Experience with cloud platforms (GCP preferred) and big data technologies (Hadoop, BigQuery, etc.).
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work collaboratively in a fast-paced environment.
- Emotional intelligence and remarkable stakeholder management abilities.

Job Type: Full-time
Benefits:
- Provident Fund
Schedule:
- Day shift
- Morning shift
Yearly bonus
Work Location: In person
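The "validation of data migrations" requirement above is commonly implemented as a reconciliation pass comparing source and target. A minimal sketch, with the caveat that the table names, the `amount` column, and the use of a single SQLite database for both sides are illustrative assumptions (in practice source and target are separate systems):

```python
import sqlite3

# Hypothetical source/target pair for a migration validation pass; in real
# migrations these would live in two different systems, not one SQLite file.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE source_jobs (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE target_jobs (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO source_jobs VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO target_jobs VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

def table_stats(conn, table):
    # Row count plus a simple aggregate acts as a cheap reconciliation check;
    # production validations add per-column checksums and sampled row diffs.
    count, total = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
    ).fetchone()
    return count, round(total, 2)

src, tgt = table_stats(db, "source_jobs"), table_stats(db, "target_jobs")
assert src == tgt, f"migration mismatch: {src} vs {tgt}"
print("migration validated:", src)  # migration validated: (3, 60.5)
```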
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
chandigarh
On-site
The ideal candidate should have strong proficiency in Python with experience in Pandas. You should have a minimum of 2-3 years of practical experience with data projects and be well-versed in SQL and modern database technologies. Proficiency in version control systems, statistical methods, and practical application of machine learning concepts, especially in NLP, is required. Experience with API development and integration, as well as expertise in the Azure cloud platform (AWS/GCP knowledge is a plus), is essential.

You should also be familiar with data visualization tools like matplotlib and seaborn, data pipeline development, ETL processes, and data warehouse design and implementation. Proficiency in building and maintaining data pipelines, data modeling, and schema design, along with experience with ETL tools like Azure Data Factory, is necessary. Knowledge of data lake architectures, data cleaning and preprocessing skills, and big data technologies such as Spark and Databricks is a plus.

In addition, you should have the ability to optimize database queries and data processes, experience with real-time data processing, and strong analytical skills focusing on data quality and validation. Experience with prompt engineering, LLM application development, and AI model deployment and integration, plus knowledge of responsible AI practices, is desired. A Bachelor's or Master's degree in Computer Science, Data Science, or a related field is required, along with evidence of continuous learning such as certifications and online courses. Previous healthcare data experience and Azure certifications are highly valued. You must grasp data privacy and security principles, understand GDPR fundamentals and data protection principles, and have knowledge of handling sensitive personal data.

Strong English communication skills, experience with remote work tools, and comfort with asynchronous communication are necessary, along with the ability to work across time zones as a self-motivated, independent worker with strong documentation habits. Collaboration and communication skills are essential, including experience working with development teams, code review, technical specification writing, proactive communication, a problem-solving mindset, and the ability to explain technical concepts to non-technical stakeholders. Experience with project management tools like Jira and Trello is beneficial.

This is a full-time position with benefits including Provident Fund. The work location is in person, and the expected start date is 01/03/2025.
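The Pandas-based cleaning and preprocessing work this listing asks for typically looks like the sketch below. The column names and cleaning rules are hypothetical (the healthcare context is the posting's, but the dataset here is invented for illustration):

```python
import pandas as pd

# Hypothetical patient-visit records; the columns and rules are illustrative
# assumptions, not taken from the posting.
raw = pd.DataFrame({
    "patient_id": [101, 101, 102, 103],
    "visit_date": ["2024-01-05", "2024-01-05", "2024-02-10", None],
    "charge": ["120.5", "120.5", "abc", "80"],
})

clean = (
    raw.drop_duplicates()                       # remove exact duplicate rows
       .assign(
           # Coerce bad values to NaT/NaN instead of raising.
           visit_date=lambda d: pd.to_datetime(d["visit_date"], errors="coerce"),
           charge=lambda d: pd.to_numeric(d["charge"], errors="coerce"),
       )
       .dropna(subset=["visit_date"])           # drop rows missing a visit date
)

print(len(clean))                     # 2 rows survive cleaning
print(clean["charge"].isna().sum())   # 1 unparseable charge flagged as NaN
```

The `errors="coerce"` pattern keeps the pipeline running on dirty input while leaving an auditable trail of nulls for the data-quality checks the listing also mentions.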
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
You are an experienced Quality Engineer with 6 to 8 years of experience (B2 level) and a strong skill set in testing analytical systems. Joining the RE&ILS Operations value stream, you will play a crucial role in ensuring the quality and reliability of data solutions within the Operations Value Stream.

Your responsibilities include developing and executing comprehensive test plans and test cases for data solutions such as data pipelines, ETL processes, and data warehouses. You will conduct data validation and verification to guarantee data accuracy, completeness, and consistency. It is essential for you to identify, document, and track defects and issues while collaborating with development teams to resolve them. Collaboration with data engineers, data scientists, and other stakeholders is necessary to understand data requirements fully and ensure that testing covers all necessary scenarios. You will automate data testing processes using appropriate tools and frameworks, as well as conduct performance testing to ensure that data solutions can handle expected workloads. Participating in code reviews and providing feedback on data quality and testing practices will be part of your routine, and you must continuously improve testing processes and methodologies to enhance the efficiency and effectiveness of data testing.

Requirements for this role include proven experience in data testing and quality engineering and a strong understanding of data engineering practices, including ETL processes, data pipelines, and data warehousing. Proficiency in SQL and experience with database management systems like MS SQL Server are essential. Knowledge of SSIS, SSAS, data testing tools and frameworks (e.g., pytest, dbt), and cloud data platforms (e.g., Snowflake, Azure Data Factory) is required, along with strong analytical and problem-solving skills, excellent communication and collaboration skills, and the ability to work independently and as part of a team.
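The accuracy, completeness, and consistency checks described above are often automated as small assertion-based tests (with pytest, one of the frameworks the listing names, these functions would simply be called from `test_*` functions). A minimal sketch with a hypothetical dataset and rules:

```python
# Minimal data-quality checks of the kind automated against a warehouse
# extract; the records, fields, and thresholds here are hypothetical.
rows = [
    {"policy_id": "P1", "premium": 1200.0, "region": "UK"},
    {"policy_id": "P2", "premium": 950.0, "region": "EU"},
]

def check_completeness(records, required):
    # Every required field must be present and non-null in every record.
    return all(r.get(f) is not None for r in records for f in required)

def check_uniqueness(records, key):
    # A business key must not repeat across records.
    values = [r[key] for r in records]
    return len(values) == len(set(values))

def check_range(records, field, low, high):
    # Numeric values must fall inside a plausible business range.
    return all(low <= r[field] <= high for r in records)

assert check_completeness(rows, ["policy_id", "premium"])
assert check_uniqueness(rows, "policy_id")
assert check_range(rows, "premium", 0, 1_000_000)
print("all data-quality checks passed")
```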
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
delhi
On-site
Decimal Business Solutions Pte Ltd is an IT services company specializing in SAP ERP implementation, mobile and cloud solutions, product development, AMS, and IT consulting services. Our team possesses expertise across various industry verticals, including Financial Services, Manufacturing, Logistics, Supply Chain, Marketing & Sales, Hi-Tech, and IT Consulting. We aim to be the most innovative, customer-centric solution provider in IT Consulting and IT Enabled Services. Decimal is well recognized for its SAP Consulting, Migration, and Implementation Services, striving to deliver exceptional IT service with highly skilled resources.

This is a full-time on-site role for an SAP S/4 HANA Database Architect, located in India (offshore). The SAP S/4 HANA Database Architect will manage database administration, develop data models, design databases, execute ETL processes, and architect data solutions. Daily tasks include ensuring database performance, security, and availability, as well as collaborating with other IT teams to optimize data management practices and support SAP S/4 HANA environments.

The ideal candidate should have 10 to 15 years of experience and sufficient ABAP knowledge to delve into code details for debugging. Understanding business logic and queries is essential, along with experience in query performance optimization, large database setup, table partitioning, archiving concepts, and data distribution techniques. Experience with SAP S/4 HANA database administration and design, data modeling, SAP ABAP code, ETL processes, and SAP technologies is required. A Bachelor's degree in Computer Science, Information Technology, or a related field is necessary. Excellent problem-solving, analytical, and communication skills are essential for collaborating effectively with team members. Experience in managing large-scale database environments and relevant certifications will be advantageous.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As a Senior Integration Developer, you will play a crucial role in solving complex integration challenges and establishing reliable connections between enterprise systems, applications, and data. Your passion for creating seamless experiences through APIs, microservices, and cutting-edge technology will drive success in this role.

Your responsibilities will include designing and developing integrations for platforms like HCL's Marketing Platform (Unica) and Customer Data Platforms (CDPs). You will create secure, efficient, and scalable APIs to connect systems, data sources, and middleware using microservices and best practices. Automating processes through automated tests and monitoring scripts will ensure the reliability and high performance of your solutions. Analyzing business requirements and translating them into technical solutions that deliver value will be a key aspect of your role. You will monitor integration performance, troubleshoot issues promptly, and collaborate with teams to explore new integration tools and technologies for process improvement.

Your 5+ years of hands-on development experience with Python or Java, focused on enterprise application and data integration, will be invaluable. Familiarity with integration tools like Postman, Swagger, and API gateways, along with a solid understanding of protocols and formats such as REST, JSON, XML, and SOAP, will enhance your capabilities. Experience with message brokers like Apache Kafka and RabbitMQ and with ETL processes for data ingestion and transformation is essential. Additionally, experience with Customer Data Platforms such as TreasureData, Tealium, or Salesforce, familiarity with cloud platforms like AWS, GCP, Azure, or OpenShift, and certifications in Java or cloud technologies will be considered a bonus. Your problem-solving skills, coupled with a passion for innovation, will drive your success in this role.

Working in an agile environment and collaborating with cross-functional teams will be part of your daily routine, and your ability to mentor and empower team members will contribute to a positive work culture. Ideally, you hold a degree in Computer Science, IT, or a related field, which will provide you with a strong foundation for this role.

Joining this team will offer you the opportunity to work on exciting projects, challenge your skills, and grow both personally and professionally. Your contributions will make a tangible impact by connecting systems and creating seamless user experiences. The supportive work culture, flexibility, and room for innovation make this role even more rewarding. If you are excited by the prospect of this work, we look forward to hearing from you and discussing how you can contribute to our team.
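Much of the integration work described above reduces to normalizing JSON payloads between systems, whether the event arrives over REST or from a broker like Kafka. A minimal sketch, with the caveat that the field names and the target CDP schema are hypothetical:

```python
import json

# Hypothetical payload transformation of the sort an integration layer runs
# between a source system and a CDP; all field names are illustrative.
def transform_event(raw_json: str) -> dict:
    event = json.loads(raw_json)
    # Normalize keys and types before handing off to the downstream system.
    return {
        "customer_id": str(event["custId"]),
        "email": event.get("email", "").strip().lower(),
        "event_type": event.get("type", "unknown"),
    }

incoming = '{"custId": 42, "email": " User@Example.COM ", "type": "signup"}'
print(transform_event(incoming))
# {'customer_id': '42', 'email': 'user@example.com', 'event_type': 'signup'}
```

Keeping the transformation as a pure function like this makes it trivial to cover with the automated tests the listing calls for, independent of the transport (REST endpoint or broker consumer) wrapped around it.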
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
We are seeking a talented Power BI Developer to join our team. Your primary role will involve designing, creating, and maintaining business intelligence solutions using Power BI. You will collaborate closely with stakeholders to understand business requirements and transform them into technical solutions that provide actionable insights.

Your responsibilities will include developing, implementing, and managing Power BI dashboards, reports, and data models. You will engage with stakeholders to collect and analyze requirements, connect to various data sources, and prepare data for reporting purposes. It will also be your duty to optimize Power BI solutions for performance and scalability, ensuring data accuracy and integrity in reports and dashboards. Troubleshooting and resolving any issues related to Power BI solutions will be part of your routine. Additionally, you will provide training and support to end users and keep abreast of the latest advancements in Power BI and related technologies.

To excel in this role, you must be proficient in Power BI, including DAX and Power Query, and have hands-on experience in data modeling and report development. A strong understanding of SQL and database management systems is essential, as is familiarity with ETL processes and data integration. Excellent problem-solving skills, attention to detail, and effective communication and collaboration abilities are crucial, and you should be able to work both independently and within a team environment. Preferred qualifications include experience with Azure Data Services or other cloud platforms, knowledge of Python or R for data analysis, certification in Power BI or related tools, and familiarity with other BI tools like Tableau or Qlik.

Join our team at Bristlecone, a prominent provider of AI-powered application transformation services for the connected supply chain. We equip our customers with speed, visibility, automation, and resiliency to adapt and succeed in a constantly changing environment. Our innovative solutions in Digital Logistics, Cognitive Manufacturing, Autonomous Planning, Smart Procurement, and Digitalization are structured around essential industry pillars and delivered through a comprehensive range of services spanning digital strategy, design and build, and implementation across various technology platforms. Bristlecone, ranked among the top ten leaders in supply chain services by Gartner, is headquartered in San Jose, California, with a global presence across North America, Europe, and Asia and a team of over 2,500 consultants. As part of the $19.4 billion Mahindra Group, we are committed to providing equal opportunities to all.

As part of your responsibilities, you are required to understand and adhere to Information Security policies, guidelines, and procedures for safeguarding organizational data and information systems. Participation in information security training is mandatory, and you should promptly report any suspected security or policy breaches to the InfoSec team or the appropriate authority (CISO). You are also expected to comply with any additional information security responsibilities associated with your job role.
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be joining our team as a skilled Power BI Senior Developer Analyst, where your primary responsibility will be designing, developing, and maintaining business intelligence solutions using Microsoft Power BI. The role demands a strong analytical mindset, exceptional problem-solving skills, and the capacity to collaborate effectively with cross-functional teams to provide data-driven insights.

Your main responsibilities will include designing, developing, and deploying Power BI reports and dashboards that offer actionable insights to stakeholders. You will work closely with business analysts and stakeholders to gather requirements and translate them into technical specifications. It will be essential for you to optimize Power BI solutions for performance and scalability, ensuring data accuracy and integrity through data validation and quality checks. In addition, you will be expected to provide training and support to end users on Power BI tools and functionality. Staying updated with the latest Power BI features and industry trends will be crucial to continuously enhance reporting capabilities. Your role will also involve troubleshooting and resolving issues related to Power BI reports and dashboards, as well as documenting processes, methodologies, and best practices for Power BI development.

To excel in this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 8-10 years of overall experience, including a minimum of 5 years as a Power BI Developer or in a similar role. Relevant Microsoft Power BI developer certifications will be advantageous. You should demonstrate strong proficiency in Power BI, particularly in DAX and Power Query, as well as experience in data modeling, data warehousing, and ETL processes. Familiarity with SQL and database management systems is preferred, and experience with other BI tools such as Tableau or Qlik is a plus.

Moreover, you should exhibit excellent analytical and problem-solving skills, along with strong communication and interpersonal abilities. The capacity to work both independently and as part of a team is essential, as is a commitment to continuously enhancing your skills to keep pace with the evolving landscape of Power BI and business intelligence technologies.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Supply Chain Transformation Consultant at EFESO Management Consultants, you will play a pivotal role in advising clients on supply chain strategy, business transformation, and digitalization. You will work closely with clients to automate and optimize their supply chain management processes, acting as a bridge between business requirements and IT solutions. Your responsibilities will include shaping business solutions, defining digitalization requirements, and supporting end-to-end project delivery with a strong focus on customer interaction.

You are a self-motivated problem solver with a passion for combining technical, supply chain, and business perspectives. Your decision-making is driven by finding the best solutions for clients, rather than personal ego. By inspiring customers daily, you contribute to achieving our collective goals and building long-lasting relationships as a trusted advisor.

To excel in this role, you should have a minimum of 3-5 years of relevant work experience and a solid background in end-to-end supply chain management. Your expertise should include supply chain process analysis and optimization and a proficient understanding of software systems such as ERP, advanced planning systems, and BI tools. Experience with technical domains like ETL processes, scripting languages, and BI tools, along with proficiency in MS Excel, is essential. Additionally, strong consulting skills, project experience, and the ability to collaborate effectively with customers and technology partners are key requirements.

At EFESO, you will have the opportunity to become a thought leader in digital supply chain transformation. You will work in a supportive environment with a great team culture, flexible work hours, and respect for your ideas. We offer attractive remuneration, paid maternity and paternity leave, company-sponsored insurance plans, and opportunities for paid certifications in relevant technology areas.

Join us on a journey to revolutionize supply chain management and shape your own success story within our innovative and dynamic team.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
You are passionate about utilizing your skills as a Tableau Administrator with experience in AWS. Your main responsibility will be to manage and optimize our Tableau Server environment hosted on AWS, ensuring efficient operation, data security, and seamless integration with other data sources and analytics tools.

Your key responsibilities will include:
- Managing, configuring, and administering Tableau Server on AWS
- Monitoring server activity and performance, and conducting regular system maintenance
- Troubleshooting issues and collaborating with data engineers and analysts to optimize data sources and dashboard performance
- Implementing security protocols
- Automating monitoring and server management tasks using AWS and Tableau APIs
- Designing and developing complex Tableau dashboards
- Providing technical support and training to Tableau users
- Staying updated on the latest Tableau and AWS features and best practices to recommend and implement improvements

To excel in this role, you should have proven experience as a Tableau Administrator with strong skills in Tableau Server and Tableau Desktop, experience with AWS services relevant to hosting and managing Tableau Server, familiarity with SQL and working with various databases, and knowledge of data integration, ETL processes, and data warehousing principles. Strong problem-solving skills, the ability to work in a fast-paced environment, and excellent communication and collaboration skills are essential; relevant certifications in Tableau and AWS would be a plus.

As a Tableau Administrator, you will also handle server administration, user management, security implementation, data source connections, license management, backup and recovery planning, performance optimization, resource scaling, troubleshooting, version upgrades, monitoring and logging, training and support, collaboration with various teams, documentation maintenance, governance implementation, integration with other systems, usage analytics, and staying current with Tableau updates and best practices.

Join our team at NTT DATA, a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. Be part of a diverse team with experts in more than 50 countries and contribute to the development, implementation, and management of digital and AI infrastructure. Embrace the opportunity to move confidently and sustainably into the digital future with NTT DATA. Visit us at us.nttdata.com.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a member of the team at Viraaj HR Solutions, you will play a crucial role in designing, developing, and maintaining ETL processes using Talend Open Studio. Your responsibilities will include collaborating with cross-functional teams to gather requirements, executing data migration and transformation processes efficiently, and developing test cases for data integration. Monitoring and improving data quality metrics to ensure accuracy, troubleshooting and debugging ETL processes, and documenting technical specifications will be essential aspects of your role.

You will be expected to implement best practices in data management and analytics, assist in the extraction, transformation, and loading of large datasets, and ensure compliance with data governance and protection policies. Providing support for production issues, conducting performance tuning and optimization of ETL processes, and collaborating with the data warehouse team to ensure optimal data architecture will also be part of your responsibilities.

Your qualifications should include a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Talend Open Studio Developer or in a similar ETL development role. A strong understanding of data integration and ETL best practices, proficiency in SQL and database management systems, and experience with data quality tools and methodologies are required. Familiarity with data visualization tools, excellent problem-solving and analytical skills, and the ability to work in a team-oriented environment are also important. Effective communication skills, both written and verbal, knowledge of agile methodologies and project management tools, and experience with cloud technologies and environments will be beneficial.

Strong attention to detail, a commitment to delivering high-quality work, the ability to manage multiple tasks and meet deadlines, an understanding of data governance and compliance regulations, and a willingness to continuously learn and adapt to new technologies are necessary for success in this role. If you are passionate about driving success through innovative solutions, staying updated on industry trends in data integration technologies, and contributing to continuous improvement initiatives, we invite you to join our dynamic team at Viraaj HR Solutions.
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
kolkata, west bengal
On-site
You are a detail-oriented Junior Data Analyst with 1-3 years of experience, joining a team in Kolkata. Your role involves collecting, cleaning, and analyzing data to identify trends, generating reports and visualizations, and ensuring data accuracy and consistency. Collaborating with different departments, you support data-driven decision-making, provide key insights and recommendations, and automate data processes. You maintain data integrity, conduct market research, and collaborate with IT teams on data storage enhancements, staying updated on industry trends.

You possess a Bachelor's degree in a related field and proficiency in SQL, data visualization tools, Excel, and basic Python/R. Strong problem-solving, analytical, and communication skills are essential, along with attention to detail and the ability to work independently and in a team environment. Preferred skills include experience with large datasets, knowledge of data warehousing and ETL processes, exposure to cloud-based data platforms, and familiarity with data governance and compliance standards.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
The role of Manager, Data Quality within the Data Platforms and Engineering Services (DPES) program is crucial for ensuring the integrity, consistency, and reliability of the data warehouse. As the Data Quality Manager, you will lead a team tasked with monitoring and enhancing data quality across pipelines to align with business requirements and compliance standards.

Your primary responsibilities will include team leadership and collaboration. This involves managing a team of software engineers and data quality analysts, offering guidance and training to drive high performance. You will also conduct regular data quality reviews and present the findings to senior management. Collaboration with Product Management Technical, DPES partners, and business stakeholders is essential to support the roadmap and address evolving needs effectively.

In terms of data quality management and improvement, you will be responsible for establishing and maintaining data quality standards, policies, and procedures for the data warehouse. Defining and tracking data quality metrics and KPIs to identify areas for enhancement will be key, and you will implement and enhance data quality tools to strengthen validation and monitoring processes. Managing data quality projects to ensure timely delivery and leading the team in monitoring and reporting data quality trends, issues, and improvements are integral aspects of the role. Root cause analysis and resolution will also be a critical responsibility: guiding the team in investigating and resolving data discrepancies promptly and identifying recurring data quality issues to address root causes.

Qualifications required for this role include a Bachelor's degree in Computer Science, Information Technology, Data Management, or a related field, and a minimum of 5 years of experience in data management, with at least 2 years in a leadership role. Strong expertise in data warehousing, ETL processes, and data quality management is necessary, along with experience in big data platforms such as Hadoop, Oracle, and Snowflake. Proficiency in data quality frameworks, methodologies, and tools, as well as a deep understanding of relational databases (SQL) and data warehousing concepts, is also required. Excellent communication, leadership, and problem-solving skills are vital for success in this role.

As a member of the team working with Mastercard assets, information, and networks, you are responsible for information security. This involves abiding by Mastercard's security policies and practices, maintaining the confidentiality and integrity of the information you access, reporting any suspected information security violations or breaches, and completing all mandatory security trainings in accordance with Mastercard's guidelines.
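Data-quality KPIs of the kind this role tracks are usually computed per pipeline run and compared against a threshold. A minimal sketch of a completeness metric, noting that the field names, records, and the 70% threshold are illustrative assumptions:

```python
# Hypothetical completeness KPI of the kind a data-quality team tracks per
# pipeline run; field names and the threshold are illustrative assumptions.
def completeness(records, field):
    """Fraction of records with a non-null value for `field`."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

batch = [
    {"txn_id": "T1", "merchant": "A"},
    {"txn_id": "T2", "merchant": None},
    {"txn_id": "T3", "merchant": "B"},
    {"txn_id": "T4", "merchant": "C"},
]

score = completeness(batch, "merchant")
print(f"merchant completeness: {score:.0%}")  # merchant completeness: 75%
assert score >= 0.70, "completeness KPI below threshold"
```

Tracking a handful of such per-field scores over time is what turns "monitoring and reporting data quality trends" into a concrete dashboard.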