2.0 - 6.0 years
4 - 8 Lacs
Hyderabad
Work from Office
The Informatica PowerCenter role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Informatica PowerCenter domain.
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Mumbai
Work from Office
The Informatica PowerCenter role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Informatica PowerCenter domain.
Posted 1 month ago
1.0 - 3.0 years
7 - 12 Lacs
Bengaluru
Hybrid
Job Summary: We are looking for a motivated, detail-oriented Junior Data Engineer with 1-2 years of experience to join our data engineering team. The ideal candidate has solid foundational skills in SQL and Python, along with exposure to building or maintaining data pipelines. You'll play a key role in ingesting, processing, and transforming data to support business and analytical needs.

Key Responsibilities:
- Assist in the design, development, and maintenance of scalable, efficient data pipelines.
- Write clean, maintainable, performance-optimized SQL queries.
- Develop data transformation scripts and automation using Python.
- Support data ingestion from various internal and external sources.
- Monitor data pipeline performance and help troubleshoot issues.
- Collaborate with data analysts, data scientists, and other engineers to ensure data quality and consistency.
- Work with cloud-based data solutions and tools (AWS, Azure, or GCP, as applicable).
- Document technical processes and pipeline architecture.

Core Skills Required:
- Proficiency in SQL (data querying, joins, aggregations, performance tuning).
- Experience with Python for data manipulation (e.g., pandas, NumPy).
- Exposure to ETL/ELT pipelines and data workflow orchestration tools (Airflow, Prefect, or Luigi preferred).
- Understanding of relational databases and data warehouse concepts.
- Familiarity with version control systems such as Git.

Preferred Qualifications:
- Experience with cloud data services (AWS S3, Redshift, Azure Data Lake, etc.).
- Familiarity with data modeling and data integration concepts.
- Basic knowledge of CI/CD practices for data pipelines.
- Bachelor's degree in Computer Science, Engineering, or a related field.

NOTE: One face-to-face technical round is mandatory, and candidates must be based in Bangalore. Outstation candidates, kindly do not apply.
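The SQL and pandas skills this posting lists (joins, aggregations, deduplication, cleaning) can be sketched in a short, self-contained example. The dataset, column names, and cleaning rules below are illustrative assumptions, not details from the posting:

```python
import pandas as pd

# Hypothetical raw ingest: order events with duplicate and incomplete rows.
raw = pd.DataFrame({
    "order_id": [1, 2, 2, 3, 4],
    "region": ["south", "north", "north", "south", None],
    "amount": [120.0, 80.0, 80.0, None, 50.0],
})

# Typical junior-data-engineer transformation steps:
clean = (
    raw.drop_duplicates(subset="order_id")   # dedupe repeated events
       .dropna(subset=["region", "amount"])  # drop rows missing key fields
)

# Aggregate: total and average amount per region — the pandas
# equivalent of a SQL GROUP BY with SUM/AVG.
summary = clean.groupby("region")["amount"].agg(["sum", "mean"]).reset_index()
print(summary)
```

The same shape of logic would typically run inside a scheduled pipeline task rather than a script, but the transformation steps are the same.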
Posted 1 month ago
8.0 - 13.0 years
30 - 45 Lacs
Bengaluru
Hybrid
Role & responsibilities

1. Senior Snowflake Developer
Experience: 8+ years
Location: Bangalore, Mahadevapura (hybrid, UK shift, 3 days in office)
Notice period: immediate to 15 days
CTC: 37 Lakhs

Summary: ThoughtFocus is looking for a senior Snowflake developer for our NYC- and London-based financial services client operating in public/private loans, CLOs, and long/short credit. You will play a pivotal role in the successful delivery of our strategic initiatives, developing solutions using technologies such as Snowflake, Coalesce, and Fivetran.

Location: Bengaluru, India

Requirements:
- IT experience of 8+ years, with a minimum of 3+ years as a Snowflake developer.
- Design, develop, and optimize Snowflake objects such as databases, schemas, tables, views, and stored procedures.
- Expertise in Snowflake utilities such as SnowSQL, Snowpipe, stages, tables, zero-copy clone, streams and tasks, time travel, data sharing, data governance, and row access policies.
- Experience migrating data from Azure to Snowflake, ensuring data integrity, performance optimization, and minimal disruption to business operations.
- Experience with Snowpipe for continuously loading and unloading data into Snowflake tables.
- Experience using the COPY, PUT, LIST, GET, and REMOVE commands.
- Experience with Azure integration and data loading (batch and bulk loading).
- Experience creating system roles, custom roles, and role hierarchies in Snowflake.
- Expertise in masking policies and network policies in Snowflake.
- Responsible for designing and maintaining ETL pipelines (Coalesce and Fivetran), including extracting data from the MS SQL Server database and transforming it per business requirements.
- Extensive experience writing complex SQL queries, stored procedures, views, functions, triggers, indexes, and exception handling in MS SQL Server (T-SQL).
- Effective communication and interpersonal skills.
- Ability to influence and collaborate effectively with cross-functional teams.
- Exceptional problem-solving abilities.
- Experience working in an agile development environment.
- Experience working in a fast-paced, dynamic environment.
- Good to have: prior experience with, or a high-level understanding of, hedge funds, private debt, and private equity.

What's on offer:
- Competitive, above-market salary.
- Hybrid work schedule.
- Opportunity to gain exposure and technology experience in global financial markets.

Education: Bachelor's degree in Computer Science, IT, Finance, Economics, or equivalent.

2. Lead Snowflake Developer
Location: Bangalore (UK shift, 3 days work from office)
CTC: 45 Lakhs

- 13+ years of IT experience, with a proven track record of successfully leading a development team to deliver complex SQL and Snowflake projects.
- Strong communication and interpersonal skills.
- Ability to influence and collaborate effectively with cross-functional teams.
- Exceptional problem-solving and decision-making abilities.
- Experience working in an agile development environment.
- Experience working in a fast-paced, dynamic environment.
- Good to have: prior experience with, or a high-level understanding of, hedge funds, private debt, and private equity.
- SQL and Snowflake development expertise across all aspects of analysis: understanding the business requirement, taking an optimized approach to developing code, and ensuring data quality in the outputs presented.
- Advanced SQL to create and performance-optimize stored procedures and functions.
- Analytical approach to translating data into last-mile SQL objects for consumption in reports and dashboards.
- 5+ years of experience in MS SQL and Snowflake.
- 3+ years of experience in teams where SQL outputs were consumed via Power BI, Tableau, SSRS, or similar tools.
- Able to define and enforce best practices.
- Good communication skills to discuss and deliver requirements effectively with the client.
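The "last mile SQL objects" both roles describe — views and procedures that reports and dashboards consume directly — can be sketched as follows. SQLite stands in for MS SQL/Snowflake purely to keep the example self-contained, and the trades schema is a hypothetical illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (book TEXT, notional REAL, side TEXT);
INSERT INTO trades VALUES
  ('credit', 100.0, 'long'),
  ('credit',  40.0, 'short'),
  ('loans',   75.0, 'long');

-- A "last mile" object: a view the BI tool (Power BI / Tableau / SSRS)
-- reads directly, so aggregation logic lives in SQL, not the dashboard.
CREATE VIEW book_exposure AS
SELECT book,
       SUM(CASE WHEN side = 'long'  THEN notional ELSE 0 END) AS long_notional,
       SUM(CASE WHEN side = 'short' THEN notional ELSE 0 END) AS short_notional
FROM trades
GROUP BY book;
""")

rows = conn.execute("SELECT * FROM book_exposure ORDER BY book").fetchall()
print(rows)  # credit: 100 long / 40 short; loans: 75 long
```

Keeping the pivot in a view rather than in the report is what makes the output reusable across dashboards, which is the design point the listing's "best practices" requirement gestures at.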
Posted 1 month ago
5.0 - 9.0 years
7 - 11 Lacs
Hyderabad
Work from Office
What you'll do
The following are high-level responsibilities you will take on, but are not limited to:
- Analyze business requirements.
- Analyze the data model and perform gap analysis against business requirements and Power BI.
- Design and model the Power BI schema.
- Transform data in Power BI/SQL/ETL tools.
- Create DAX formulas, reports, and dashboards.
- Write SQL queries and stored procedures.
- Design effective Power BI solutions based on business requirements.
- Manage a team of Power BI developers and guide their work.
- Integrate data from various sources into Power BI for analysis.
- Optimize the performance of reports and dashboards for smooth usage.
- Collaborate with stakeholders to align Power BI projects with goals.
- Knowledge of data warehousing (must); data engineering is a plus.

What you'll bring
- B.Tech in computer science or equivalent.
- Minimum 5+ years of relevant experience.
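As a rough illustration of what a DAX measure such as a total-by-region computes when sliced by a dimension, the equivalent grouped aggregation can be sketched in pandas. The table and column names are hypothetical, and pandas here is only an analogy for the model-side calculation, not how Power BI evaluates it:

```python
import pandas as pd

# Hypothetical fact table, as it might land in a Power BI model.
sales = pd.DataFrame({
    "Region": ["East", "West", "East", "West"],
    "Amount": [200.0, 150.0, 100.0, 50.0],
})

# A DAX measure like  Total Sales = SUM(Sales[Amount])  evaluated
# against a Region slicer behaves like this grouped aggregation:
total_by_region = sales.groupby("Region")["Amount"].sum()
print(total_by_region.to_dict())  # {'East': 300.0, 'West': 200.0}
```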
Posted 1 month ago
2.0 - 5.0 years
4 - 8 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams and writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good hands-on experience with DBT is required; ETL (DataStage) and Snowflake are preferred.
- Ability to use programming languages such as Java, Python, or Scala to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago
2.0 - 5.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: O9 Solutions
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by delivering high-quality applications that enhance operational efficiency.

Roles & Responsibilities:
- Play the integration consultant role on o9 implementation projects.
- Understand the o9 platform's data model (table structures, linkages, pipelines, optimal designs) for designing various planning use cases.
- Review and analyze the data provided by the customer along with its technical/functional intent and inter-dependencies.
- Participate in technical design and data requirements gathering, making recommendations in case of inaccurate or missing data.
- Design and create batch schedules based on frequency and configuration settings for daily/weekly/quarterly/yearly batches.
- Deliver end-to-end integration implementation from the partner system to the o9 platform.

Technical Skills:
- Minimum 3 to 7 years of experience with SQL, PySpark, Python, Spark SQL and ETL tools.
- Proficiency in databases (SQL Server, Oracle, etc.).
- Knowledge of DDL, DML, and stored procedures.
- Good to have: experience with Airflow, Delta Lake, NiFi, and Kafka.
- At least one end-to-end integration implementation is preferred.
- Any API-based integration experience is an added advantage.

Professional Skills:
- Proven ability to work creatively and analytically in a problem-solving environment.
- Proven ability to build, manage and foster a team-oriented environment.
- Excellent problem-solving skills with excellent written/oral communication and interpersonal skills.
- Strong collaborator, team player, and individual contributor.

Educational Qualification: BE/BTech/MCA/Bachelor's or Master's degree in computer science or related fields of work is preferred.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in O9 Solutions.
- This position is based in Pune.
- A 15 years full time education is required.
- Open to travel, short/long term.

Qualification: 15 years full time education
Posted 1 month ago
5.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing project progress, coordinating with teams, and ensuring successful application delivery.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Ensure successful application delivery
- Coordinate with cross-functional teams

Professional & Technical Skills:
- Must-have skills: proficiency in SAP BusinessObjects Data Services
- Strong understanding of data integration and ETL processes
- Experience leading application development projects
- Knowledge of the SAP BusinessObjects platform
- Hands-on experience configuring and optimizing applications

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP BusinessObjects Data Services
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education
Posted 1 month ago
3.0 - 5.0 years
2 - 5 Lacs
Gurugram
Work from Office
Skill required: Retirement Solutions - Retirement Planning Services
Designation: Customer Service Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com

What would you do
- Lead conversion design sessions with the client and produce design documents and mapping/transformation documents.
- Transition end-to-end DC 401(k) and 403(b) retirement plans from the prior service provider/recordkeeper to the current service provider (current organization), including transfer of plan assets, participant records, and conversion data files.
- Research, analyze and recommend data conversion strategies for complex retirement plans; analyze client data in the format it is received, identify data deficiencies, define remediation of deficiencies, and construct a statement of work that properly outlines the conversion process to reformat into data requirements.
- Manage and lead the migration and testing of static data and transaction data for the plan conversion.

A retirement solution is a comprehensive process for understanding how much money you will need when you retire, and helps you identify the best approach. It covers the full range of services needed throughout a plan's life, including plan development & enhancement, sales & marketing, plan sponsor/institutional client onboarding/management, participant enrollment/management, and sponsor and member servicing & reporting. Products consist of individual retirement accounts (Roth IRA), college savings accounts, guaranteed investment contracts, fixed & variable deferred annuities (qualified & non-qualified), as well as corporate retirement funds.

What are we looking for
- 3+ years of experience in the US Retirement Services domain managing services for Defined Contribution plans.
- Minimum 3 years of experience in a data analyst position on a plan conversion (defined contributions) team; basic professional MS Access Database & SQL query experience preferred.
- Working knowledge of Microsoft Access, Excel, SQL and other ETL tools is required.
- Ability to manage large data sets (census files, financial/payroll files) for Defined Contribution plans: 401(k), 403(b), 457.
- Demonstrated aptitude in data, metrics, analysis and trends, and applied knowledge of measurement, statistics and program evaluation.
- Basic understanding of the proprietary systems, administration services, and related data services.
- Basic knowledge of conversion reconciliation methodology.
- Strong organizational and detail-orientation skills; ability to work well with both technical and non-technical resources.
- Proven ability to work independently and with a team in a results/deadline-driven environment.

Roles and Responsibilities:
- Lead conversion design sessions with the client and produce design documents and mapping/transformation documents.
- Transition end-to-end DC 401(k) and 403(b) retirement plans from the prior service provider/recordkeeper to the current service provider, including transfer of plan assets, participant records, and conversion data files.
- Research, analyze and recommend data conversion strategies for complex retirement plans; analyze client data in the format it is received, identify data deficiencies, define remediation of deficiencies, and construct a statement of work that properly outlines the conversion process.
- Manage and lead the migration and testing of static data and transaction data for the plan conversion.
- Effectively communicate data requirements with the client, recordkeeper or project team; consult the client or recordkeeper on best practices; and deliver a code-based programming solution to achieve a successful data conversion. You are expected to take ownership of each conversion with high quality and consistent on-time results.
- Take full accountability for the data conversion development life cycle and methodology, including project requirements, client acceptance, timeline creation, implementation, testing, and production activities.
- Build files that transform massive amounts of data into the client retirement product's requirements and formatting.
- Develop reports for internal and external business partners using SQL Server, MS Access, Cognos, and Discovery.
- Perform data migration audits, reconciliation, and exception reporting as necessary.
- Collaborate with the recordkeeper, internal project management, and client data end users.

Qualification: Any Graduation
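The data migration audit and reconciliation step this role describes can be sketched as a two-sided comparison of prior-recordkeeper and converted-system extracts. The file layout, identifiers, and column names below are illustrative assumptions, not the client's actual formats:

```python
import pandas as pd

# Hypothetical participant balance extracts: prior recordkeeper vs. converted system.
prior = pd.DataFrame({"ssn": ["A1", "A2", "A3"], "balance": [1000.0, 2500.0, 400.0]})
converted = pd.DataFrame({"ssn": ["A1", "A2", "A4"], "balance": [1000.0, 2400.0, 90.0]})

# Full outer join with an indicator column flags records present on only one side.
recon = prior.merge(converted, on="ssn", how="outer",
                    suffixes=("_prior", "_conv"), indicator=True)

# Exceptions: participants missing on either side, or balances that do not tie out.
exceptions = recon[(recon["_merge"] != "both") |
                   (recon["balance_prior"] != recon["balance_conv"])]
print(exceptions[["ssn", "_merge"]])
```

In practice the same comparison would run over census and financial/payroll files as well, with the exception report handed back to the recordkeeper for remediation.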
Posted 1 month ago
5.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft Power Business Intelligence (BI)
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication among team members and stakeholders.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Lead the application development process effectively
- Ensure seamless communication among team members and stakeholders
- Provide guidance and mentorship to team members

Professional & Technical Skills:
- Must-have skills: proficiency in Microsoft Power Business Intelligence (BI)
- Strong understanding of data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms
- Solid grasp of data munging techniques for data quality and integrity

Additional Information:
- The candidate should have a minimum of 12 years of experience in Microsoft Power Business Intelligence (BI)
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education
Posted 1 month ago
5.0 - 10.0 years
5 - 15 Lacs
Chennai
Work from Office
About the Role
We are seeking a highly skilled Senior Azure Data Solutions Architect to design and implement scalable, secure, and efficient data solutions supporting enterprise-wide analytics and business intelligence initiatives. You will lead the architecture of modern data platforms, drive cloud migration, and collaborate with cross-functional teams to deliver robust Azure-based solutions.

Key Responsibilities
- Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB).
- Design robust and scalable data models, including relational, dimensional, and NoSQL schemas.
- Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg.
- Integrate data governance, quality, and security best practices into all architecture designs.
- Support analytics and machine learning initiatives through structured data pipelines and platforms.
- Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs.
- Drive CI/CD integration with Databricks using Azure DevOps and tools like DBT.
- Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability.
- Stay current with Azure platform advancements and recommend improvements.

Required Skills & Experience
- Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse.
- Expertise in data modeling and design (relational, dimensional, NoSQL).
- Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures.
- Proficiency in Python, SQL, Scala, and/or Java.
- Strong knowledge of data governance, security, and compliance frameworks.
- Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates).
- Familiarity with BI and analytics tools such as Power BI or Tableau.
- Excellent communication, collaboration, and stakeholder management skills.
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.

Preferred Qualifications
- Experience in regulated industries (finance, healthcare, etc.).
- Familiarity with data cataloging, metadata management, and machine learning integration.
- Leadership experience guiding teams and presenting architectural strategies to leadership.

Why Join Us?
- Work on cutting-edge cloud data platforms in a collaborative, innovative environment.
- Lead strategic data initiatives that impact enterprise-wide decision-making.
- Competitive compensation and opportunities for professional growth.
Posted 1 month ago
5.0 - 7.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Engineer / Technical Lead
Location: Bangalore
Employment Type: Full-time

Role Summary
We are seeking a highly skilled and motivated Senior Data Engineer / Technical Lead to take ownership of the end-to-end delivery of a key project involving data lake transitions, data warehouse maintenance, and enhancement initiatives. The ideal candidate will bring strong technical leadership, excellent communication skills, and hands-on expertise with modern data engineering tools and platforms. Experience with Databricks and JIRA is highly desirable. Knowledge of the supply chain and finance domains is a plus; otherwise, a willingness to quickly ramp up in these areas is expected.

Key Responsibilities
Delivery Management
- Lead and manage data lake transition initiatives under the Gold framework.
- Oversee delivery of enhancements and defect fixes related to the enterprise data warehouse.
Technical Leadership
- Design and develop efficient, scalable data pipelines using Python, PySpark, and SQL.
- Ensure adherence to coding standards, performance benchmarks, and data quality goals.
- Conduct performance tuning and infrastructure optimization for data solutions.
- Provide code reviews, mentorship, and technical guidance to the engineering team.
Collaboration & Stakeholder Engagement
- Collaborate with business stakeholders (particularly the Laboratory Products team) to gather, interpret, and refine requirements.
- Communicate technical solutions and project progress clearly to both technical and non-technical audiences.
Tooling and Technology Use
- Leverage tools such as Databricks, Informatica, AWS Glue, Google DataProc, and Airflow for ETL and data integration.
- Use JIRA to manage project workflows, track defects, and report progress.
Documentation and Best Practices
- Create and review documentation including architecture, design, testing, and deployment artifacts.
- Define and promote reusable templates, checklists, and best practices for data engineering tasks.
Domain Adaptation
- Apply or gain knowledge in the supply chain and finance domains to enhance project outcomes and align with business needs.

Skills and Qualifications
Technical Proficiency
- Strong hands-on experience in Python, PySpark, and SQL.
- Expertise with ETL tools such as Informatica, AWS Glue, Databricks, and Google Cloud DataProc.
- Deep understanding of data warehousing solutions (e.g., Snowflake, BigQuery, Delta Lake, lakehouse architectures).
- Familiarity with performance tuning, cost optimization, and data modeling best practices.
Platform & Tools
- Proficient in working with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience in version control and configuration management practices.
- Working knowledge of JIRA and Agile methodologies.
Certifications (preferred but not required)
- Certifications in cloud technologies, ETL platforms, or a relevant domain (e.g., AWS Data Engineer, Databricks Data Engineer, supply chain certification).

Expected Outcomes
- Timely, high-quality delivery of data engineering solutions.
- Reduction in production defects and improved pipeline performance.
- Increased team efficiency through reuse of components and automation.
- Positive stakeholder feedback and high team engagement.
- Consistent adherence to SLAs, security policies, and compliance guidelines.

Performance Metrics
- Adherence to project timelines and engineering standards
- Reduction in post-release defects and production issues
- Improvement in data pipeline efficiency and resource utilization
- Resolution time for pipeline failures and data issues
- Completion of required certifications and training

Preferred Background
- Background or exposure to the supply chain or finance domains
- Willingness to work morning US East hours
- Ability to work independently and drive initiatives with minimal oversight

Required Skills: Databricks, Data Warehousing, ETL, SQL
Posted 1 month ago
1.0 - 5.0 years
5 - 9 Lacs
Thane
Work from Office
Skills: SQL, NoSQL Databases, PL/SQL, Python, ETL Tools, Database Optimization, Microsoft SQL Server, PostgreSQL

Job Title: Database Programmer
Experience: 5 to 8 Years
Location: Thane
Job Type: Full-time

We are looking for an experienced Database Programmer with 5 to 8 years of expertise to join our dynamic team in Thane. The ideal candidate should have a strong background in database development and be proficient in related technologies. You will be responsible for designing, developing, and optimizing database solutions while ensuring high performance and scalability.

Key Responsibilities
- Design, develop, and maintain database solutions using MS SQL and PostgreSQL.
- Write and optimize stored procedures, triggers, and functions.
- Perform performance tuning to ensure efficient database operations.
- Implement and maintain data integrity and security measures.
- Utilize Git for version control and collaborative development.
- Manage and track tasks effectively using JIRA.
- Work closely with application developers to design efficient database structures.
- Troubleshoot database-related issues and provide solutions.
- Optimize queries and indexing strategies for performance enhancement.
- Ensure database scalability and maintainability to support business growth.

Required Skills & Qualifications
- 5 to 8 years of experience in MS SQL and PostgreSQL development.
- Proficiency in writing stored procedures, triggers, and functions.
- Strong expertise in performance tuning and query optimization.
- Experience with Git for version control and collaborative development.
- Experience working with JIRA in an Agile environment.
- Excellent understanding of database normalization and indexing techniques.
- Strong analytical and problem-solving skills, with the ability to troubleshoot database issues.

Preferred Qualifications
- Experience with database replication and clustering techniques.
- Familiarity with cloud-based database solutions.
- Knowledge of ETL processes and data migration strategies.
- Understanding of NoSQL databases and when to use them.

If you are passionate about database development and optimization, we would love to hear from you!
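The triggers, indexing, and data-integrity work this role lists can be illustrated with a small self-contained sketch. SQLite stands in for MS SQL/PostgreSQL purely to keep the example runnable, and the accounts/audit schema is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT, balance REAL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);

-- Trigger: record every balance change, the kind of data-integrity
-- mechanism the role's triggers requirement refers to.
CREATE TRIGGER trg_balance_audit AFTER UPDATE OF balance ON accounts
BEGIN
  INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;

-- Index on a frequently filtered column, for query performance.
CREATE INDEX idx_accounts_owner ON accounts(owner);
""")

conn.execute("INSERT INTO accounts VALUES (1, 'asha', 100.0)")
conn.execute("UPDATE accounts SET balance = 250.0 WHERE id = 1")
log = conn.execute("SELECT * FROM audit_log").fetchall()
print(log)  # [(1, 100.0, 250.0)]
```

In MS SQL or PostgreSQL the trigger body would live in T-SQL or PL/pgSQL respectively, but the audit-trail pattern is the same.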
Posted 1 month ago
1.0 - 5.0 years
5 - 9 Lacs
Mumbai
Work from Office
Skills: SQL, NoSQL Databases, PL/SQL, Python, ETL Tools, Database Optimization, Microsoft SQL Server, PostgreSQL

Job Title: Database Programmer
Experience: 5 to 8 Years
Location: Thane
Job Type: Full-time

We are looking for an experienced Database Programmer with 5 to 8 years of expertise to join our dynamic team in Thane. The ideal candidate should have a strong background in database development and be proficient in related technologies. You will be responsible for designing, developing, and optimizing database solutions while ensuring high performance and scalability.

Key Responsibilities
- Design, develop, and maintain database solutions using MS SQL and PostgreSQL.
- Write and optimize stored procedures, triggers, and functions.
- Perform performance tuning to ensure efficient database operations.
- Implement and maintain data integrity and security measures.
- Utilize Git for version control and collaborative development.
- Manage and track tasks effectively using JIRA.
- Work closely with application developers to design efficient database structures.
- Troubleshoot database-related issues and provide solutions.
- Optimize queries and indexing strategies for performance enhancement.
- Ensure database scalability and maintainability to support business growth.

Required Skills & Qualifications
- 5 to 8 years of experience in MS SQL and PostgreSQL development.
- Proficiency in writing stored procedures, triggers, and functions.
- Strong expertise in performance tuning and query optimization.
- Experience with Git for version control and collaborative development.
- Experience working with JIRA in an Agile environment.
- Excellent understanding of database normalization and indexing techniques.
- Strong analytical and problem-solving skills, with the ability to troubleshoot database issues.

Preferred Qualifications
- Experience with database replication and clustering techniques.
- Familiarity with cloud-based database solutions.
- Knowledge of ETL processes and data migration strategies.
- Understanding of NoSQL databases and when to use them.

If you are passionate about database development and optimization, we would love to hear from you!
Posted 1 month ago
5.0 - 10.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together. The ETL Developer is responsible for the design, development and maintenance of various ETL processes. This includes the design and development of processes for various types of data, potentially large datasets and disparate data sources that require transformation and cleansing to become a usable data set. This candidate should also be able to find creative solutions to complex and diverse business requirements. The developer should have a solid working knowledge of programming languages, data analysis, design, and ETL tool sets. The ideal candidate must possess a solid background in Data Engineering development technologies. The candidate must possess excellent written and verbal communication skills with the ability to collaborate effectively with business and technical experts in the team. Primary Responsibility Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications Graduate degree or equivalent experience 6+ years of development, administration and migration experience in Azure Databricks and Snowflake 6+ years of experience with data design/patterns - Data Warehousing, Dimensional Modeling and Lakehouse Medallion Architecture 5+ years of experience working with Azure Data Factory 5+ years of experience in setting up, maintenance and usage of Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database etc. 5+ years of experience working with Python and PySpark 3+ years of experience with Kafka Excellent communication skills to effectively convey technical concepts to both technical and non-technical stakeholders At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 1 month ago
3.0 - 6.0 years
7 - 11 Lacs
Noida
Work from Office
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Shift Timing - 3.30 pm to 1.00 am Primary Responsibilities Independently analyzes and investigates data and complex issues using business intelligence tools to produce findings that may be shared with others Independently creates business intelligence solutions and models to support existing analyses, perform new analyses, interpret results, develop actionable insights and present recommendations for use across the company Independently develops tools and models such as segmentation, dashboards, data visualizations, decision aids and business case analysis Work and collaborate closely with US-based leaders in a matrixed team environment Be able to partner directly with business stakeholders to understand business problems, gather requirements, and demonstrate tools, solutions and analysis Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Positions in this function are responsible for the management and manipulation of mostly structured data, with a focus on building business intelligence tools, conducting analysis to distinguish patterns and recognize trends, performing normalization operations and assuring data quality. Depending on the specific role and business line, example responsibilities in this function could include creating specifications to bring data into a common structure, creating product specifications and models, developing data solutions to support analyses, performing analysis, interpreting results, developing actionable insights and presenting recommendations for use across the company. Roles in this function could partner with stakeholders to understand data requirements and develop tools and models such as segmentation, dashboards, data visualizations, decision aids and business case analysis to support the organization. Other roles involved could include producing and managing the delivery of activity and value analytics to external stakeholders and clients. Team members will typically use business intelligence, data visualization, query, analytic and statistical software to build solutions, perform analysis and interpret data. Positions in this function work on predominantly descriptive and regression-based analytics and tend to leverage subject matter expert views in the design of their analytics and algorithms. This function is not intended for employees performing the following work: production of standard or self-service operational reporting, causal-inference-led (healthcare analytics) or data-pattern-recognition (data science) analysis, and/or image or unstructured data analysis using sophisticated theoretical frameworks. Generally, work is self-directed and not prescribed. 
Required Qualifications 5+ years of experience in business/finance including analysis experience with a solid understanding of data visualization 5+ years of experience in analysis of business process and workflow and providing an evaluation, benchmark and/or process improvement recommendation 5+ years of experience with SQL 4+ years of experience with Alteryx, SAS or other ETL tools Experience with Agile Methodology Proven ability to work in ambiguity Preferred Qualifications Bachelor's degree in Business, Finance, Health Administration or related field 3+ years of total experience in Power BI or Tableau Snowflake Experience Experience with Agile and Scrum methodologies Healthcare Experience Knowledge of Azure Fundamentals Knowledge of DBT At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 1 month ago
6.0 - 11.0 years
11 - 16 Lacs
Pune
Work from Office
Job Summary Synechron is seeking an experienced Lead Python with Java Developer to join our team. This critical role is designed to drive the development and integration of robust software solutions that bolster our business objectives. The successful candidate will leverage their expertise in Python, Java, and data management technologies to deliver high-quality applications and contribute to strategic projects within the organization. Software Requirements Required: Proficiency in Python and Java Experience with ETL tools, specifically IBM DataStage Preferred: Familiarity with SQL, BigData, PySpark, and Hive Overall Responsibilities Lead the design, development, and maintenance of scalable software applications using Python and Java. Oversee the integration of ETL processes using IBM DataStage. Collaborate with cross-functional teams to define software requirements and deliver solutions that meet business needs. Ensure the performance, quality, and responsiveness of applications through rigorous testing and optimization. Mentor and guide junior developers, setting best practices and coding standards. Technical Skills (By Category) Programming Languages: Required: Python, Java Databases/Data Management: Preferred: SQL, BigData, Hive Frameworks and Libraries: Preferred: PySpark Development Tools and Methodologies: Required: IBM DataStage Security Protocols: Preferred: Understanding of data security measures in application development Experience Requirements Minimum 6+ years of experience in software development roles. Proven experience in developing applications using Python and Java. Experience in the financial services industry is preferred but not mandatory. Alternative pathways include significant project-based experience in software development and data management. Day-to-Day Activities Develop and maintain high-quality, testable code in Python and Java. Conduct regular code reviews and ensure adherence to best practices. 
Collaborate with stakeholders to gather requirements and define project scope. Participate in regular team meetings and contribute to project planning sessions. Provide technical leadership and decision-making for software development projects. Qualifications Required: Bachelor's degree in Computer Science, Information Technology, or a related field. Preferred: Certifications in relevant programming languages or ETL tools. Commitment to continuous professional development and staying current with industry trends. Professional Competencies Strong critical thinking and problem-solving capabilities. Demonstrated leadership and teamwork abilities. Excellent communication and stakeholder management skills. Adaptability and a continuous learning orientation. Innovative mindset and ability to manage time and priorities effectively.
Posted 1 month ago
4.0 - 8.0 years
8 - 15 Lacs
Hyderabad
Hybrid
About the Company: DiLytics is a leading Information Technology (IT) Services provider completely focused on providing services in Analytics, Business Intelligence, Data Warehousing, Data Integration and Enterprise Performance Management areas. We have been growing for 12+ years and have offices in the US, Canada and India. We are an employee-friendly company that offers an exciting and stress-free work culture and provides career paths where elements of job enrichment and flexibility to move across roles are inherent. Key Responsibilities: Manage a team of ETL developers, assign tasks, and ensure timely delivery of projects and PoCs. Provide technical leadership and groom a team of ETL developers. Design and develop complex mappings, process flows and ETL scripts. Perform data extraction and transformation using SQL queries to create the data sets required for dashboards. Optimize ETL processes for efficiency, scalability, and performance tuning. Utilize appropriate ETL tools and technologies (e.g., ODI, ADF, SSIS, Alteryx, Talend, etc.) Stay up to date on the latest ETL trends and technologies. Exposure to designing and developing BI reports and dashboards using Power BI/Tableau and other tools to meet business analytic needs. Skills Required: Bachelor's degree in Computer Science or a related field. Relevant experience of 4 to 8 years. Extensive experience in designing and implementing ETL processes. Experience in designing / developing ETL processes such as ETL control tables, error logging, auditing, data quality, etc. Expertise in Data Integration tool sets - Azure Data Factory, Oracle Data Integrator, SQL Server Integration Services, Talend, etc. - and PL/SQL. Exposure to one or more of these data visualization tools - OAC, Power BI, Tableau, OBIEE. Excellent written, verbal communication and interpersonal skills.
Posted 1 month ago
5.0 - 7.0 years
27 - 32 Lacs
Chennai
Work from Office
Job Title: Lead / Senior Salesforce Developer Experience Range: 6-8 years in Salesforce (10+ years overall in IT) Hiring Location: Bangalore, Chennai, Trivandrum, Kochi, Pune & Hyderabad Must Have Skills 6-8 years hands-on Salesforce.com development experience (Apex, Visualforce, Lightning) Strong experience in Salesforce Lightning (Aura framework), creating Lightning Components Hands-on with Apex, Visualforce, Triggers, SOQL, JavaScript, and Salesforce DX tools Proven experience on at least 3 Salesforce Lightning projects Expertise in Salesforce administration, configuration, and customization Implementation experience with workflow rules, validation rules, approval processes, reports, and dashboards Experience integrating Salesforce with external systems using REST/SOAP APIs Hands-on experience with Sales, Service, and Community Clouds Production deployment using change sets, Eclipse/ANT migration tools Proficiency in Apex Data Loader and ETL tools (e.g., Informatica, Mulesoft, Boomi, Cast Iron) Experience with SQL/PL-SQL and database development Strong understanding of coding best practices and governor limits Excellent communication skills and ability to work with US-based stakeholders Knowledge of Salesforce admin activities - user/role/profile setup, security configuration Good to Have Skills Experience in the healthcare domain Experience with Salesforce1/Mobile, Heroku, Radian6, Einstein Analytics Familiarity with HTML5, CSS, AJAX, XML, jQuery, or other JS frameworks Prior development experience in Java or .NET Exposure to Google APIs and other third-party integrations Salesforce Certifications Must Have: Platform Developer I App Builder Desirable: Platform Developer II (or DEV 501 - all 3 parts completed) Good to Have: Advanced Administrator (ADM 301) Consultant Certifications (Sales Cloud / Service Cloud) Service Cloud Certification Required Skills: Salesforce, Apex, Visualforce
Posted 1 month ago
4.0 - 9.0 years
1 - 1 Lacs
Mumbai Suburban, Navi Mumbai, Mumbai (All Areas)
Work from Office
Stakeholder & Vendor Management; Project Coordination & Execution; Risk & Compliance Oversight; Team Support & Development; Reporting & Performance Analysis; Agile Methodology; PMP; Waterfall; project management tools; data analysis techniques.
Posted 1 month ago
1.0 - 5.0 years
3 - 7 Lacs
Mumbai
Work from Office
The Data Engineer identifies business problems and translates them into data services and engineering outcomes. You will deliver data solutions that empower better decision making and that scale to respond to broader business questions. Key Responsibilities As a Data Engineer, you are a full-stack data engineer that loves solving business problems. You work with business leads, analysts and data scientists to understand the business domain and engage with fellow engineers to build data products that empower better decision making. You are passionate about the data quality of our business metrics and the flexibility of your solution that scales to respond to broader business questions. If you love to solve problems using your skills, then come join the Team Searce. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself. Consistently strive to acquire new skills on Cloud, DevOps, Big Data, AI and ML Understand the business problem and translate these to data services and engineering outcomes Explore new technologies and learn new techniques to solve business problems creatively Think big! and drive the strategy for better data quality for the customers Collaborate with many teams - engineering and business, to build better data products Preferred Qualifications 1-2 years of hands-on experience with at least one programming language (Python, Java, Scala) An understanding of SQL is a must Big data (Hadoop, Hive, Yarn, Sqoop) MPP platforms (Spark, Pig, Presto) Data-pipeline & scheduler tools (Oozie, Airflow, NiFi) Streaming engines (Kafka, Storm, Spark Streaming) Any relational database or DW experience Any ETL tool experience Hands-on experience in pipeline design, ETL and application development Good communication skills Experience in working independently and strong analytical skills Dependable and good team player Desire to learn and work with new technologies Automation in your blood
Posted 1 month ago
9.0 - 14.0 years
4 - 7 Lacs
Bengaluru
Work from Office
This role involves the development and application of engineering practice and knowledge in designing, managing and improving the processes for Industrial operations, including procurement, supply chain, and facilities engineering and maintenance. Project and change management of industrial transformations are also included in this role. - Grade Specific: Focus on Industrial Operations Engineering. Fully competent in own area. Acts as a key contributor in a more complex/critical environment. Proactively acts to understand and anticipate client needs. Manages costs and profitability for a work area. Manages own agenda to meet agreed targets. Develops plans for projects in own area. Looks beyond the immediate problem to the wider implications. Acts as a facilitator and coach and moves teams forward. Skills (competencies)
Posted 1 month ago
4.0 - 8.0 years
0 - 0 Lacs
Bengaluru
Hybrid
Job Description About this role: Wells Fargo is seeking a Senior Analytics Consultant. In this role, you will: Consult, review and research moderately complex business, operational, and technical challenges that require an in-depth evaluation of variable data factors Perform moderately complex data analysis to support and drive strategic initiatives and business needs Develop a deep understanding of technical systems and business processes to extract data driven insights while identifying opportunities for engineering enhancements Lead or participate on large cross group projects Mentor less experienced staff Collaborate and consult with peers, colleagues, external contractors, and mid-level managers to resolve issues and achieve goals Leverage a solid understanding of compliance and risk management requirements for supported area Required Qualifications: 4+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Desired Qualifications: Experience in Analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Job Expectations: Design, develop, and maintain dashboards using Tableau and Qlik Sense. Build clean, intuitive, and responsive UIs tailored to end-user needs. Participate in data sourcing and transformation processes using SQL and other ETL tools. Develop and execute unit tests and assist with UAT processes for all developed reports. Ensure high quality and consistency across all visual outputs, performing QA prior to delivery. Collaborate with the technical lead and delivery leads to align on requirements and timelines. Support post-deployment monitoring and address production issues as needed. Continuously optimize dashboards for better performance and data refresh times. Follow SDLC protocols including documentation, testing, and change management procedures. 
Deliver timely and high-quality dashboards that meet business specifications. Demonstrate strong analytical thinking and attention to detail. Work in close coordination with a global team to support operational and strategic reporting. Participate in performance tuning and error resolution. Adapt to evolving technologies and tools as needed.
Posted 1 month ago
0.0 - 5.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Job Title: Snaplogic. Experience: 0-5 Years. Location: Bangalore. Skills: Snaplogic
Posted 1 month ago
6.0 - 8.0 years
1 - 4 Lacs
Chennai
Work from Office
Job Title: Snowflake Developer. Experience: 6-8 Years. Location: Chennai - Hybrid. Requirements: 3+ years of experience as a Snowflake Developer or Data Engineer. Strong knowledge of SQL, SnowSQL, and Snowflake schema design. Experience with ETL tools and data pipeline automation. Basic understanding of US healthcare data (claims, eligibility, providers, payers). Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP). Familiarity with data governance, security, and compliance (HIPAA, HITECH).
Posted 1 month ago