8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Department: SE DC DG

A Snapshot of Your Day
We are seeking a highly skilled and committed Manager to lead our data accountability, data security, data regulatory compliance, data access, and Artificial Intelligence (AI) governance workstreams. This position requires a proactive leader with a robust background in data governance, data ownership, data security, and compliance who, through their team, can drive the development and implementation of comprehensive strategies to support our data roles, secure our data assets, and uphold ethical AI practices that maintain data integrity and foster a culture of data accountability.

How You'll Make An Impact
Data Accountability: Develop and implement policies and procedures to improve data accountability across the organization. Maintain the Data Roles & Responsibilities Framework and help in its operationalization across Siemens Energy. Lead enablement and training programs for the data roles. Support the build-up of data communities in the business areas and bring them closer to our formally established data roles.
Data Security: Lead the development and execution of comprehensive data security and data access management strategies, policies, and procedures, which are essential for protecting the organization's data assets, mitigating risks, ensuring compliance, and maintaining stakeholder trust.
Data Retention: Lead the development of a retention framework and secure disposal methods so that data is retained only as long as needed and disposed of securely.
AI Governance: Lead the AI Governance team working on establishing policies and processes to ensure the ethical, responsible, and compliant use of AI. Work closely with multi-functional teams across Business, Data Domains, Cyber Security, Legal and Compliance, Artificial Intelligence, Applications, etc.
Manage and mentor a team of data professionals, providing guidance, training, and support to achieve departmental goals. Develop and implement strategic goals for the team, aligned with organizational objectives.
Partner Engagement: Collaborate with stakeholders across the organization to align strategies with business objectives.
Innovation: Stay abreast of industry trends and emerging technologies, and incorporate innovative solutions to enhance data accountability, data security, and AI governance.

What You Bring
Bachelor's degree in computer science, information technology, data science, or a related field; master's degree preferred.
Minimum of 8 years of experience in data management, data governance, data ownership, and data security governance roles in a large-scale enterprise setup, with at least 3 years in a leadership capacity.
Demonstrable ability to develop and implement data governance frameworks.
Excellent leadership, communication, and interpersonal skills.
Strong analytical, problem-solving, and decision-making abilities.
Ability to work effectively in a fast-paced, dynamic environment and manage multiple priorities.
Solid understanding of data accountability/data ownership, data security, and compliance principles and practices.
Familiarity with data governance tools like Collibra, Informatica, Ataccama, or Talend.
Experience with data modeling, data architecture, and database management systems is preferred.

About The Team
Our Corporate and Global Functions are essential in driving the company's pivotal initiatives and ensuring operational excellence across various groups, business areas, and regions. These roles support our vision to become the most valued energy technology company in the world.
As part of our team, you contribute to our vision by shaping the global energy transition, partnering with our internal and external collaborators, and conducting business responsibly and in compliance with legal requirements and regulations.

Who is Siemens Energy?
At Siemens Energy, we are more than just an energy technology company. With ~100,000 dedicated employees in more than 90 countries, we develop the energy systems of the future, ensuring that the growing energy demand of the global community is met reliably and sustainably. The technologies created in our research departments and factories drive the energy transition and provide the base for one sixth of the world's electricity generation. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: https://www.siemens-energy.com/employeevideo

Our Commitment to Diversity
Lucky for us, we are not all the same. Through diversity we generate power. We run on inclusion, and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character – no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences.

Rewards/Benefits
Opportunities to work with a distributed team
Opportunities to work on and lead a variety of innovative projects
Medical benefits
Time off/Paid holidays and parental leave
Continual learning through the Learn@Siemens-Energy platform

https://jobs.siemens-energy.com/jobs
Posted 3 weeks ago
4.0 - 6.0 years
7 - 9 Lacs
Bengaluru
Work from Office
About your role
This role serves as a member of the Fidelity India team under the PSO umbrella, supporting the Fidelity Clearing Canada (FCC) technology team in a technical support and developer capacity. You will work to embed innovation across the business, maintaining consistency and standards to maximise business benefits. You will ensure the seamless operation of automated workflows, scripts, and orchestration processes for FCC applications currently in production. This includes proactive monitoring, rapid incident resolution, scripting and automation development, collaboration with cross-functional teams, and continuous improvement efforts to enhance system performance, reliability, and security. The goal is to maintain optimal functionality, minimize downtime, and contribute to the overall efficiency of automated systems, aligning with organizational objectives and standards. You will also be responsible for the development, enhancement, and maintenance of application solutions for internal and external clients.

You will work to progress Fidelity's PSO and FCC Technology support team agenda by:

Application Development Support
Ensure that all requests raised by clients and users are handled timely and appropriately, drawing on technical knowledge of operating systems, applications, and the software development lifecycle. Provide technical support to teams within the organization, and to external clients when required. Update technical documents and procedures to reflect the current state. Provide support for application deployments. Assist with systems integration when needed. Collaborate with the on-site application development support team.

Appian Application Support
Troubleshoot, fix, and enhance defects raised in Appian-based applications. Understand the differences between REST and SOAP and the basic design principles of integrating with web services. Debug issues in interfaces, process models, and integrations, and provide short-term and long-term solutions. Identify chokepoints and provide design recommendations to enhance application performance. Provide technical guidance to junior developers as and when required.

Defect Remediation
Remediate defects based on business and client priorities to address service disruptions, incidents, and problems.

Client Experience
Deliver quality customer service interactions to our internal and external customers to create a positive experience. Take ownership of solving a customer's problem promptly; use all available resources to achieve the best outcome.

About You
Skills and Knowledge
Strong technical insight and experience to inform, guide, challenge, and support technical decisions.
Strong analytical, conceptual, and innovative problem-solving abilities.
Strong attention to detail.
Ability to work independently while being in a team environment.
Excellent communication skills, both written and oral; ability to effectively communicate technical material to non-technical users.
Goal-oriented and a self-starter.
Ability to quickly learn, adapt, and change to meet the needs of a changing environment.
Ability to explain complex ideas to those with limited IT and systems knowledge.
Excellent problem-solving skills.
Customer service oriented.
Ability to work in a fast-paced environment without direct supervision.
Development or support experience in the Canadian financial industry is an asset.
Track record of actively seeking opportunities for process improvements, efficiency gains, and system optimizations in the context of automation and orchestration.

Experience and Qualifications
Job-Related Experience Minimum Requirement: 4+ years

Must Have:
3+ years of experience as a developer or programmer/support engineer, including 2+ years of experience in the brokerage securities/asset management industry.
2+ years of Appian BPM (or similar) hands-on development experience.
1+ years of experience and intermediate-level knowledge of Java/J2EE development, including Spring, Hibernate, MyBatis, JPA, RESTful APIs, and Spring Boot.
Strong hands-on knowledge of SQL and database platforms such as MySQL, SQL Server, and Oracle, including database design.
Handy knowledge of the Unix/Linux operating system and shell scripts.
Exposure to automated testing, DevOps, and change management concepts.
Experience with Agile development methodologies.

Nice to Have:
Experience with the following: PowerBI, Talend, ETL, Data Warehouse, Control-M.
uniFide, Salesforce, or Dataphile platform experience would be an asset.
Working knowledge of HTML and adaptive/responsive design.
DocuSign and document management platforms.
Atlassian stack (JIRA, Confluence).
Hands-on expertise in creating high-performance web applications leveraging React or Angular 2. Some knowledge of concepts such as TypeScript, the Bootstrap grid system, dependency injection, and SPAs (single-page applications).
Experience with cloud-based implementations.
Experience in setting up Secure File Transfer Protocol (SFTP) and file delivery; AWS would be an asset.

Education: First degree level (Bachelor's degree) or equivalent in Computer Science. Knowledge of the financial services industry.

Dynamic Working
This role is categorised as Hybrid (Office/Remote).
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Responsibilities
Talend Developer – Talend Development

Required Skills
Design and develop programs using RPA tools like Automation Anywhere and Xceptor.
Exposure to other RPA tools like WorkFusion and UiPath.
Knowledge of Java is preferable.
Proven track record in successful automation of processes.
Must have three to seven years of development experience, of which at least one to two years used an RPA tool.
Must have good knowledge of SQL.
Strong analytical, debugging, and bug-fixing skills.
Document defects in a defect-tracking tool and track them to closure.
Collate and publish status updates.
Experience with JIRA, VB scripting, and Excel macros is an added advantage.
Should have knowledge of the banking domain.

Good-to-Have Skills
Should have good communication and analytical skills.
Should have team-leading skills and the ability to collaborate effectively with coworkers.
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
ETL: Hands-on experience building data pipelines (see the sketch after this listing). Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica.
Big Data: Experience with big data platforms such as Hadoop, Hive, or Snowflake for data storage and processing.
Data Warehousing & Database Management: Understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures.
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala.
DevOps: Exposure to concepts and enablers – CI/CD platforms, version control, automated quality control management.
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of the underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta.
Others: Basics of a job scheduler like Autosys; basics of entitlement management.
Certification in any of the above topics would be an advantage.
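Illustrative note: as a rough sketch of the pipeline-building skill this listing asks for, the following PySpark snippet shows a minimal extract-transform-load pass. It is an assumption-laden example, not part of the posting: the S3 paths and the orders/order_id/amount/order_ts names are hypothetical.

```python
# Minimal PySpark batch pipeline: extract from a raw zone, apply simple
# data-quality controls, load to a curated zone. Paths and the
# orders/order_id/amount/order_ts names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw Parquet (one of the file formats listed above)
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: deduplicate, validate, and derive a partition column
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)                 # basic data control
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned output for downstream consumption
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/curated/orders/"))
```

The same pattern runs unchanged on most of the platforms the listing names (EMR, Databricks, Cloudera), which is why Spark proficiency features so prominently here.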
Posted 3 weeks ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Architect – Data Integration & Engineering
Location: Hybrid
Experience: 8+ years

Job Summary:
We are seeking an experienced Data Architect specializing in data integration, data engineering, and hands-on coding to design, implement, and manage scalable, high-performance data solutions. The ideal candidate should have expertise in ETL/ELT, cloud data platforms, big data technologies, and enterprise data architecture.

Key Responsibilities:
1. Data Architecture & Design: Develop enterprise-level data architecture solutions, ensuring scalability, performance, and reliability. Design data models (conceptual, logical, physical) for structured and unstructured data. Define and implement data integration frameworks using industry-standard tools. Ensure compliance with data governance, security, and regulatory policies (GDPR, HIPAA, etc.).
2. Data Integration & Engineering: Implement ETL/ELT pipelines using Informatica, Talend, Apache NiFi, or dbt. Work with batch and real-time data processing tools such as Apache Kafka, Kinesis, and Apache Flink. Integrate and optimize data lakes, data warehouses, and NoSQL databases.
3. Hands-on Coding & Development: Write efficient, scalable code in Python, Java, or Scala for data transformation and processing. Optimize SQL queries, stored procedures, and indexing strategies for performance tuning. Build and maintain Spark-based data processing solutions in Databricks and Cloudera ecosystems. Develop workflow automation using Apache Airflow, Prefect, or similar tools (see the sketch after this listing).
4. Cloud & Big Data Technologies: Work with cloud platforms such as AWS (Redshift, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Dataflow). Manage big data processing using Cloudera, Hadoop, HBase, and Apache Spark. Deploy containerized data services using Kubernetes and Docker. Automate infrastructure using Terraform and CloudFormation.
5. Governance, Security & Compliance: Implement data security, masking, and encryption strategies. Define RBAC (role-based access control) and IAM policies for data access. Work on metadata management, data lineage, and cataloging.

Required Skills & Technologies:
Data Engineering & Integration:
ETL/ELT Tools: Informatica, Talend, Apache NiFi, dbt
Big Data Ecosystem: Cloudera, HBase, Apache Hadoop, Spark
Data Streaming: Apache Kafka, AWS Kinesis, Apache Flink
Data Warehouses: Snowflake, AWS Redshift, Google BigQuery, Azure Synapse
Databases: PostgreSQL, MySQL, MongoDB, Cassandra
Programming & Scripting:
Languages: Python, Java, Scala
Scripting: Shell, PowerShell, Bash
Frameworks: PySpark, SparkSQL
Cloud & DevOps:
Cloud Platforms: AWS, Azure, GCP
Containerization & Orchestration: Kubernetes, Docker
CI/CD Pipelines: Jenkins, GitHub Actions, Terraform, CloudFormation
Security & Governance:
Compliance Standards: GDPR, HIPAA, SOC 2
Data Cataloging: Collibra, Alation
Access Controls: IAM, RBAC, ABAC

Preferred Certifications:
AWS Certified Data Analytics – Specialty
Microsoft Certified: Azure Data Engineer Associate
Google Professional Data Engineer
Databricks Certified Data Engineer Associate/Professional
Cloudera Certified Data Engineer
Informatica Certified Professional

Education & Experience:
Bachelor's/Master's degree in Computer Science, MCA, Data Engineering, or a related field.
8+ years of experience in data architecture, integration, and engineering.
Proven expertise in designing and implementing enterprise-scale data solutions.
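Illustrative note: the workflow-automation duty above names Apache Airflow; the following is a minimal DAG sketch, assuming Airflow 2.4+ and hypothetical task names, not an artifact of this employer.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+; dag_id, schedule,
# and task bodies are illustrative placeholders)
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling source data...")      # stub: source extraction

def transform():
    print("applying business rules...")  # stub: transformation logic

def load():
    print("publishing to warehouse...")  # stub: warehouse load

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # cron-style schedules also work here
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```

The `>>` operator encodes task dependencies, which is the core idea the posting gestures at: the scheduler, not hand-run scripts, enforces ordering and retries.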
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As part of the Astellas commitment to delivering value for our patients, our organization is currently undergoing a transformation to achieve this critical goal. This is an opportunity to work on digital transformation and make a real impact within a company dedicated to improving lives. DigitalX, our new information technology function, is spearheading this value-driven transformation across Astellas. We are looking for people who excel in embracing change, can manage technical challenges, and have exceptional communication skills.

We are seeking committed and talented MDM Engineers to join our new FoundationX team, which lies at the heart of DigitalX. As a member of FoundationX, you will play a critical role in ensuring our MDM systems are operational and scalable and continue to contain the right data to drive business value. You will play a pivotal role in building, maintaining, and enhancing our MDM systems. This position is based in India and may require on-site work from time to time. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.

Purpose and Scope
As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, data warehousing, and ETL systems. You'll work closely with FoundationX data engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend and DataBricks) will be essential for testing data pipelines.

Essential Job Responsibilities
Development Ownership: Support testing for data warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments.
Test Strategy and Planning: Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT).
Data Validation and Quality Assurance: Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency. Ensure compliance with industry standards.
Regression Testing: Validate changes to data pipelines and analytics tools. Monitor performance metrics.
Test Case Design and Execution: Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation.
Data Security and Privacy: Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations.
Collaboration and Communication: Work with cross-functional teams. Communicate test progress and results.
Continuous Improvement and Technical Support: Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms.

Qualifications Required
Bachelor's degree in computer science, information technology, or a related field (or equivalent experience).
3-5+ years of proven experience as a tester, developer, or data analyst within a pharmaceutical company or a similar regulatory environment.
3-5+ years' experience in BI development and ETL development using Qlik or Power BI, including DAX and Power Automate (MS Flow) or Power BI alerts, or equivalent technologies.
Experience with Qlik Sense and QlikView, the Tableau application, and creating data models.
Familiarity with business intelligence and data warehousing concepts (star schema, snowflake schema, data marts).
Knowledge of SQL, ETL frameworks, and data integration techniques.
Other complex and highly regulated industry experience will be considered, across diverse areas like Commercial, Manufacturing, and Medical.
Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools.
Exposure to at least 1-2 full, large, complex project life cycles.
Experience with test management software (e.g., qTest, Zephyr, ALM).
Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting). Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems. (A small automated-check sketch follows this listing.)
Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.

Preferred:
Experience working in the pharma industry. An understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous.
Certifications in BI tools or testing methodologies.
Knowledge of cloud-based BI solutions (e.g., Azure, AWS).
Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges.
Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.

Category: FoundationX
Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
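Illustrative note: to make the "manual and automated tests on data pipelines" duty concrete, here is a minimal pytest sketch of two data-validation checks. The file names, the customer_id business key, and the rules themselves are assumptions for illustration only, not this employer's test suite.

```python
# Hypothetical automated data-validation checks, runnable with pytest.
import pandas as pd

def test_row_counts_match():
    source = pd.read_csv("source_extract.csv")    # pre-ETL snapshot (placeholder)
    target = pd.read_csv("warehouse_load.csv")    # post-ETL snapshot (placeholder)
    # completeness: no rows silently dropped or duplicated in flight
    assert len(source) == len(target), "row counts diverged during ETL"

def test_no_null_business_keys():
    target = pd.read_csv("warehouse_load.csv")
    # consistency: a business key must never be null after loading
    assert target["customer_id"].notna().all(), "null customer_id found"
```

Checks like these typically run in a CI/CD pipeline after every load, which is the connection to the "Agile Champion" requirement above.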
Posted 3 weeks ago
8.0 - 12.0 years
15 - 27 Lacs
Mumbai, Pune, Bengaluru
Work from Office
Role & responsibilities
Job Description: Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS. Must have: AWS Databricks. Good to have: PySpark, Snowflake, Talend.

Requirements:
• Candidate must be experienced working in projects involving, and other ideal qualifications include experience in, the following:
• Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc.
• Should be very proficient in doing large-scale data operations using Databricks and overall very comfortable using Python (see the sketch after this listing).
• Familiarity with AWS compute, storage, and IAM concepts.
• Experience in working with S3 Data Lake as the storage tier.
• Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
• Cloud warehouse experience (Snowflake, etc.) is a huge plus.
• Carefully evaluates alternative risks and solutions before taking action.
• Optimizes the use of all available resources.
• Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.

Skills:
• Hands-on experience with Databricks, Spark SQL, and the AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
• Experience in shell scripting.
• Exceptionally strong analytical and problem-solving skills.
• Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
• Strong experience with relational databases and data access methods, especially SQL.
• Excellent collaboration and cross-functional leadership skills.
• Excellent communication skills, both written and verbal.
• Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
• Ability to leverage data assets to respond to complex questions that require timely answers.
• Working knowledge of migrating relational and dimensional databases on the AWS Cloud platform.

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.

Note: Need only immediate joiners/those serving notice period. Interested candidates can apply.

Regards,
HR Manager
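Illustrative note: a minimal sketch of the "Databricks Spark SQL over an S3 data lake" skill named above, with placeholder bucket, view, and column names. On Databricks the SparkSession already exists as `spark`, so the builder line is only needed outside that environment.

```python
# Sketch: query an S3 data-lake tier with Spark SQL and publish a mart.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3_lake_query").getOrCreate()

# Register raw S3 data as a temporary view so it can be queried with SQL
events = spark.read.json("s3://example-lake/raw/events/")
events.createOrReplaceTempView("events")

# Spark SQL aggregation over the lake, written back as Parquet
daily = spark.sql("""
    SELECT event_date, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date
    ORDER BY event_date
""")
daily.write.mode("overwrite").parquet("s3://example-lake/marts/daily_events/")
```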
Posted 3 weeks ago
6.0 years
0 Lacs
India
Remote
Job Type: Contract
Location: Remote
Experience: 7+ yrs

Job Description:
· Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies.
· Monitor active ETL jobs in production.
· Build out data lineage artifacts to ensure all current and future systems are properly documented.
· Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
· Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies.
· Discover efficiencies in shared data processes and batch schedules to help ensure no redundancy and smooth operations.
· Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs (a minimal example follows this listing).
· Hands-on experience in developing and implementing large-scale data warehouses, business intelligence, and MDM solutions, including data lakes/data vaults.

Required Skills:
· This job has no supervisory responsibilities.
· Strong experience with Snowflake and Azure Data Factory (ADF) is required.
· Bachelor's degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field, AND 6+ years' experience in business analytics, data science, software development, data modeling, or data engineering work.
· 5+ years' experience with strong SQL query/development skills.
· Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
· Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
· Experience working in the healthcare industry with PHI/PII.
· Creative, lateral, and critical thinker.
· Excellent communicator.
· Well-developed interpersonal skills.
· Good at prioritizing tasks and time management.
· Ability to describe, create, and implement new solutions.
· Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
· Knowledge of / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau).
· Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume).
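Illustrative note: the data-quality duty above can be made concrete with a small probe using the official snowflake-connector-python package. The account, credentials, and the claims_batch table are placeholders, and the row-count rule is only one example of such a check.

```python
# Hypothetical row-count probe against Snowflake; every identifier and
# credential below is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",          # placeholder account locator
    user="ETL_MONITOR",
    password="***",             # use a secrets manager in practice
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute(
        "SELECT COUNT(*) FROM claims_batch WHERE load_date = CURRENT_DATE"
    )
    (row_count,) = cur.fetchone()
    if row_count == 0:
        # surface the failure instead of letting an empty batch flow on
        raise RuntimeError("daily batch is empty; investigate upstream job")
finally:
    conn.close()
```

In a production setup a check like this would be scheduled alongside the ETL jobs it guards, alerting rather than raising.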
Posted 3 weeks ago
2.0 - 4.0 years
3 - 8 Lacs
Coimbatore
Work from Office
We are looking for an experienced ETL Developer to join our team. The ideal candidate will have strong experience with ETL tools and processes, particularly Talend, Informatica, Apache NiFi, Pentaho, or SSIS. The role requires excellent technical knowledge of databases, particularly MySQL, and a strong ability to integrate data from multiple sources. The candidate must also have strong manual testing skills and experience using version control systems such as Git, along with project tracking tools like JIRA.

Key Responsibilities:
Design, develop, and implement ETL processes using tools like Talend, Informatica, Apache NiFi, Pentaho, or SSIS.
Develop and maintain data pipelines to integrate data from various sources, including APIs, cloud storage, and third-party applications.
Perform data mapping, data transformation, and data cleansing to ensure data quality.
Write complex SQL queries for data extraction, transformation, and loading from MySQL databases (a small sketch follows this listing).
Collaborate with cross-functional teams to understand data requirements and provide scalable solutions.
Conduct manual testing to ensure the accuracy and performance of ETL processes and data.
Manage version control with Git, and track project progress in JIRA.
Troubleshoot and resolve issues related to ETL processes, data integration, and testing.
Ensure adherence to best practices for ETL design, testing, and documentation.

Required Skills:
2.5 to 4 years of experience in ETL development with tools such as Talend, Informatica, Apache NiFi, Pentaho, or SSIS.
Strong hands-on experience with MySQL databases.
Proven ability to integrate data from diverse sources (APIs, cloud storage, third-party apps).
Solid manual testing experience, with a focus on ensuring data accuracy and process integrity.
Familiarity with Git for version control and JIRA for project management.
Strong problem-solving skills with the ability to troubleshoot and resolve technical issues.
Excellent communication skills and the ability to collaborate with cross-functional teams.

Preferred Skills:
Experience working with cloud platforms such as AWS, Azure, or GCP.
Knowledge of automation frameworks for testing ETL processes.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
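Illustrative note: a minimal sketch of the extract-transform-load cycle this posting describes, using pandas with SQLAlchemy against MySQL. The connection string, tables, and cleansing rule are assumptions, and the pymysql driver is assumed to be installed.

```python
# Minimal ETL cycle against MySQL with pandas + SQLAlchemy.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mysql+pymysql://etl_user:***@localhost/sales")

# Extract: pull source rows with plain SQL
orders = pd.read_sql(
    "SELECT order_id, cust_email, amount FROM raw_orders", engine
)

# Transform: cleanse and map to the target schema's column names
orders["cust_email"] = orders["cust_email"].str.strip().str.lower()
orders = orders.rename(columns={"cust_email": "customer_email"})

# Load: append into the curated table
orders.to_sql("orders_clean", engine, if_exists="append", index=False)
```

Dedicated ETL tools like Talend or SSIS package the same three steps behind a visual designer; the underlying mapping-and-cleansing logic is identical.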
Posted 3 weeks ago
5.0 - 10.0 years
5 - 12 Lacs
Vijayawada
Work from Office
Roles & Responsibilities

Primary Responsibilities:
Design and develop complex T-SQL queries, stored procedures, triggers, functions, and views (a brief example follows this listing).
Build and maintain SSRS reports and Power BI dashboards for internal and client-facing needs.
Optimize SQL queries for high performance and ensure database reliability and integrity.
Collaborate with cross-functional teams to analyze business requirements and translate them into technical solutions.
Perform regular data validation, data profiling, and troubleshooting to maintain high data quality.

Secondary Responsibilities:
Develop and maintain ETL pipelines using Talend or equivalent tools for efficient data integration.
Contribute to data warehousing and data modeling efforts to support reporting and analytics.
Participate in code reviews, best-practice enforcement, and documentation.
Work with DevOps/infrastructure teams to manage and monitor SQL Server environments.
Contribute to automation and process improvement initiatives across BI and data functions.

Primary Skills (Must-Have):
SQL Server development (T-SQL, stored procedures, functions)
SSRS (SQL Server Reporting Services)
Power BI (report and dashboard development)
Strong understanding of relational databases and SQL query optimization

Secondary Skills (Good-to-Have):
Talend or other ETL tools
Data warehousing and data modeling
Knowledge of Agile development processes
Version control (Git, TFS, etc.)
Exposure to cloud platforms (Azure, AWS optional)

Preferred Candidate Profile:
Immediate joiners or candidates with less than a 15-day notice period preferred.
Willing to relocate to or work from the Vijayawada location (on-site/hybrid as per company policy).
Strong verbal and written communication skills to coordinate with teams and stakeholders.
Proven ability to work independently and handle multiple priorities in a fast-paced environment.
3 to 6 years of relevant experience in SQL and BI development.

Location: Vijayawada (On-site/Hybrid)
Experience Level: 5–10 Years
Employment Type: Full-Time
Notice Period: Immediate
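Illustrative note: the T-SQL stored procedures this role develops are typically invoked from application code; below is a hedged pyodbc sketch of a parameterized call. The connection string, dbo.usp_GetDailySales, and the result columns are hypothetical.

```python
# Calling a T-SQL stored procedure from Python via pyodbc. The '?'
# placeholder keeps the call parameterized, which lets SQL Server reuse
# the cached execution plan and avoids SQL injection.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=ReportingDB;Trusted_Connection=yes;"
)
cursor = conn.cursor()

cursor.execute("EXEC dbo.usp_GetDailySales @ReportDate = ?", "2024-06-01")
for row in cursor.fetchall():
    # pyodbc rows expose columns as attributes
    print(row.RegionName, row.TotalSales)

conn.close()
```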
Posted 3 weeks ago
3.0 - 5.0 years
0 Lacs
India
On-site
Job Description
Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It With Pride.

Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute
You will:
Execute the business analytics agenda in conjunction with analytics team leaders.
Work with best-in-class external partners who leverage analytics tools and processes.
Use models/algorithms to uncover signals/patterns and trends that drive long-term business performance.
Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver.

What You Will Bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
Using data analysis to make recommendations to analytic leaders.
Understanding of best-in-class analytics practices.
Knowledge of key performance indicators (KPIs) and scorecards.
Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus.

More About This Role
You will be part of Mondelēz's biggest analytics team, delivering world-class analytical solutions with global-level business impact using sophisticated tools and technologies. You will be driving visualization and data-processing capabilities to the next level by delivering business solutions that improve day-to-day productivity.
Build trust and credibility with different stakeholders to achieve common organizational goals and targets.
Develop and scale up global solutions in the advanced analytics and reporting domain for the Mondelēz supply chain.
Drive automation by performing best-in-class technology integration at Mondelēz (SAP, SQL, GCP).
Build best-in-class interactive analytics tools with a superior user experience by leveraging Power BI, Dash (Python), or JS (React).
Develop business solutions that are scalable, automated, self-sustaining, and interactive for end users.
Collaborate with external partners to deliver strategic projects on time and right the first time.
Develop and execute a strategy to automate KPIs across all existing dashboards and dataflows.
Develop custom models/algorithms to uncover signals/patterns and trends to drive long-term business performance.
Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics can deliver.
Sustain existing reports and dashboards, ensuring timely updates and no downtime.

What you need to know about this position:
What extra ingredients you will bring:
Education / Certifications: Bachelor's or master's degree in a quantitative field such as Statistics, Applied Mathematics, Engineering, or Data Science.

Job-specific requirements:
Demonstrated experience in applying analytical techniques to solve business problems.
Proven experience in analytics and reporting projects with cross-functional and global stakeholders.
Experience with visualization tools (Tableau/Power BI/Spotfire).
Experience with data management and processing tools (Talend/Alteryx/R/Prep/SQL).
Experience with web app development tools and projects may be advantageous (Power BI / Dash / React / JS).
Experience with statistical analytical tools and projects may be advantageous.
Experience using SAP may be advantageous.
Extensive experience bringing data together from multiple sources, drawing out insights, and showcasing them in easy-to-understand visualizations.
Experience in the FMCG/food products/supply chain industry.
Good communication skills.
Total relevant experience of 3-5 years.

Travel requirements:
Work schedule:
Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
Headquartered in Singapore, Mondelēz International's Asia, Middle East and Africa (AMEA) region is comprised of six business units, has more than 21,000 employees, and operates in more than 27 countries, including Australia, China, Indonesia, Ghana, India, Japan, Malaysia, New Zealand, Nigeria, Philippines, Saudi Arabia, South Africa, Thailand, United Arab Emirates and Vietnam. Seventy-six nationalities work across a network of more than 35 manufacturing plants, three global research and development technical centers, and offices stretching from Auckland, New Zealand to Casablanca, Morocco. Mondelēz International in the AMEA region is the proud maker of global and local iconic brands such as Oreo and belVita biscuits, Kinh Do mooncakes, Cadbury, Cadbury Dairy Milk and Milka chocolate, Halls candy, Stride gum, Tang powdered beverage and Philadelphia cheese. We are also proud to be named a Top Employer in many of our markets.

Mondelēz International is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling
Analytics & Data Science
Posted 3 weeks ago
5.0 - 6.0 years
9 - 12 Lacs
Pune
Hybrid
Job Description (Cabinetworks Services India)
Job Title: ETL Talend Developer

JOB SUMMARY:
The ETL Talend Developer is responsible for developing, configuring, supporting, and modifying complex integrated business and/or enterprise application solutions within various computing environments.

KEY DUTIES & RESPONSIBILITIES:
Translate business requirements and functional specifications into logical program designs, code modules, stable application systems, and software solutions.
Develop, configure, support, and modify complex integrated business and/or enterprise application solutions within various computing environments.
Facilitate the implementation and maintenance of complex business and enterprise software solutions to ensure successful deployment of released applications.
Produce clean, efficient code based on industry-level specifications and best practices. Recommend and execute improvements.
Support system integration testing (SIT) and user acceptance testing (UAT); participate in, and coach others through, all software development lifecycle phases.
Create technical documentation for reference and reporting.
Report progress and issues to the Project Leader/Manager on a regular basis and ensure project schedules are met.

EDUCATION/LICENSES/CERTIFICATION/FORMAL TRAINING REQUIREMENT (as applicable):
Bachelor's degree or equivalent in Electronics, Electrical, Computer Science or Mechanical Engineering is preferred.

EXPERIENCE REQUIREMENT:
Minimum of 4+ years of relevant experience as a Talend Developer.
Hands-on experience with Talend ESB desired: environment setup, web services, data prep, and administration of runtime, Karaf containers, and Camel routes.
Hands-on experience working with Talend Data Integration needed: configuring TAC and Studio; building, executing, and managing Talend jobs.
Hands-on experience working with SQL, PL/SQL, and data integration tools.
Hands-on experience with cloud migration and the Talend Cloud Management Console, API services, and Platform Administration is a MUST.

ESSENTIAL SKILLS & QUALIFICATIONS (MUST HAVE):
Ability to design, develop, launch, scale, monitor, and maintain Talend ETLs.
Strong experience in data quality, source systems analysis, business rules validation, source-target mapping design, performance tuning, and high-volume data loads.
A working knowledge of source-code control using any tool available in the market is preferred.
Experience with PL/SQL coding, intensive data analysis, and planning and conducting integrated testing of ETL.
Ability to translate business requirements into well-architected, customized solutions that best leverage the platform and products.
Ability to participate in efforts to develop and execute testing, training, and documentation.
Knowledge of Talend limitations, best practices, deployments, release management, project methodologies, manual testing processes, etc.
Ability to work as an individual contributor to solve complex problems through research and analysis.
Excellent organizational, verbal, and written communication skills, in English, to work with individuals and teams across the organization.
Posted 3 weeks ago
3.0 - 5.0 years
0 Lacs
Mawal, Maharashtra, India
On-site
Job Description
Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It With Pride.

Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute
You will:
Execute the business analytics agenda in conjunction with analytics team leaders.
Work with best-in-class external partners who leverage analytics tools and processes.
Use models/algorithms to uncover signals/patterns and trends that drive long-term business performance.
Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver.

What You Will Bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
Using data analysis to make recommendations to analytic leaders.
Understanding of best-in-class analytics practices.
Knowledge of key performance indicators (KPIs) and scorecards.
Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus.

More About This Role
You will be part of Mondelēz's biggest analytics team, delivering world-class analytical solutions with global-level business impact using sophisticated tools and technologies. You will be driving visualization and data-processing capabilities to the next level by delivering business solutions that improve day-to-day productivity.
Build trust and credibility with different stakeholders to achieve common organizational goals and targets.
Develop and scale up global solutions in the advanced analytics and reporting domain for the Mondelēz supply chain.
Drive automation by performing best-in-class technology integration at Mondelēz (SAP, SQL, GCP).
Build best-in-class interactive analytics tools with a superior user experience by leveraging Power BI, Dash (Python), or JS (React).
Develop business solutions that are scalable, automated, self-sustaining, and interactive for end users.
Collaborate with external partners to deliver strategic projects on time and right the first time.
Develop and execute a strategy to automate KPIs across all existing dashboards and dataflows.
Develop custom models/algorithms to uncover signals/patterns and trends to drive long-term business performance.
Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics can deliver.
Sustain existing reports and dashboards, ensuring timely updates and no downtime.

What you need to know about this position:
What extra ingredients you will bring:
Education / Certifications: Bachelor's or master's degree in a quantitative field such as Statistics, Applied Mathematics, Engineering, or Data Science.

Job-specific requirements:
Demonstrated experience in applying analytical techniques to solve business problems.
Proven experience in analytics and reporting projects with cross-functional and global stakeholders.
Experience with visualization tools (Tableau/Power BI/Spotfire).
Experience with data management and processing tools (Talend/Alteryx/R/Prep/SQL).
Experience with web app development tools and projects may be advantageous (Power BI / Dash / React / JS).
Experience with statistical analytical tools and projects may be advantageous.
Experience using SAP may be advantageous.
Extensive experience bringing data together from multiple sources, drawing out insights, and showcasing them in easy-to-understand visualizations.
Experience in the FMCG/food products/supply chain industry.
Good communication skills.
Total relevant experience of 3-5 years.

Travel requirements:
Work schedule:
Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
Headquartered in Singapore, Mondelēz International's Asia, Middle East and Africa (AMEA) region is comprised of six business units, has more than 21,000 employees, and operates in more than 27 countries, including Australia, China, Indonesia, Ghana, India, Japan, Malaysia, New Zealand, Nigeria, Philippines, Saudi Arabia, South Africa, Thailand, United Arab Emirates and Vietnam. Seventy-six nationalities work across a network of more than 35 manufacturing plants, three global research and development technical centers, and offices stretching from Auckland, New Zealand to Casablanca, Morocco. Mondelēz International in the AMEA region is the proud maker of global and local iconic brands such as Oreo and belVita biscuits, Kinh Do mooncakes, Cadbury, Cadbury Dairy Milk and Milka chocolate, Halls candy, Stride gum, Tang powdered beverage and Philadelphia cheese. We are also proud to be named a Top Employer in many of our markets.

Mondelēz International is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling
Analytics & Data Science
Posted 3 weeks ago
4.0 - 5.0 years
6 - 10 Lacs
Kochi, Bengaluru
Work from Office
4+ years of experience.
Work from office: 1st preference Kochi, 2nd preference Bangalore.
Good experience in any ETL tool.
Good knowledge of Python.
Integration experience.
Good attitude and cross-skilling ability.
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About GSPANN
GSPANN is a global IT services and consultancy provider headquartered in Milpitas, California (U.S.A.). With five global delivery centers across the globe, GSPANN provides digital solutions that support the customer buying journeys of B2B and B2C brands worldwide. With a strong focus on innovation and client satisfaction, GSPANN delivers cutting-edge solutions that drive business success and operational excellence. GSPANN helps retail, finance, manufacturing, and high-technology brands deliver competitive customer experiences and increased revenues through our solution delivery, technologies, practices, and operations for each client. For more information, visit www.gspann.com

JD for your reference:
We are looking for a passionate Data Modeler to build, optimize, and maintain conceptual, logical, and physical database models. The candidate will turn data into information, information into insight, and insight into business decisions.

Job Position: Data Modeler
Experience: 5+ years
Location: Hyderabad, Gurugram
Skills: Data Modeling, Data Analysis, Cloud, SQL

Responsibilities:
Design and develop conceptual, logical, and physical data models for databases, data warehouses, and data lakes.
Translate business requirements into data structures that fit both OLTP (online transaction processing) and OLAP (online analytical processing) environments.

Requirements:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
3+ years of experience as a Data Modeler or in a related role.
Proficiency in data modeling tools (Erwin, ER/Studio, SQL Developer Data Modeler).
Strong experience with SQL and database technologies (Oracle, SQL Server, MySQL, PostgreSQL).
Familiarity with ETL tools (Informatica, Talend, Apache NiFi) and data integration techniques.
Knowledge of data warehousing concepts and data lake architecture.
Understanding of Big Data technologies (Hadoop, Spark) is a plus.
Experience with cloud platforms like AWS, GCP, or Azure.

Why Choose GSPANN?
At GSPANN, we don't just serve our clients—we co-create. GSPANNians are passionate technologists who thrive on solving the toughest business challenges, delivering trailblazing innovations for marquee clients. This collaborative spirit fuels a culture where every individual is encouraged to sharpen their skills, feed their curiosity, and take ownership to learn, experiment, and succeed. We believe in celebrating each other's successes—big or small—and giving back to the communities we call home. If you're ready to push boundaries and be part of a close-knit team that's shaping the future of tech, we invite you to carry forward the baton of innovation with us. Let's Co-Create the Future—Together.

Discover Your Inner Technologist: Explore and expand the boundaries of tech innovation without the fear of failure.
Accelerate Your Learning: Shape your career while scripting the future of tech. Seize the ample learning opportunities to grow at a rapid pace.
Feel Included: At GSPANN, everyone is welcome. Age, gender, culture, and nationality do not matter here; what matters is YOU.
Inspire and Be Inspired: When you work with the experts, you raise your game. At GSPANN, you're in the company of marquee clients and extremely talented colleagues.
Enjoy Life: We love to celebrate milestones and victories, big or small. Ever so often, we come together as one large GSPANN family.
Give Back: Together, we serve communities.
We take steps, small and large, so we can do good for the environment, weaving sustainability and social change into our endeavors. We invite you to carry forward the baton of innovation in technology with us. Let's Co-Create.
Posted 3 weeks ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are looking for a Middleware/Interface Developer with 3-7 years of experience.

Work mode: Hybrid
Office location: Chennai

Responsibilities:
• Develop and maintain system interfaces and APIs.
• Ensure seamless integration between software applications.
• Work with RESTful and SOAP APIs for data exchange (see the sketch after this listing).
• Optimize data transmission processes for performance and reliability.
• Troubleshoot and resolve interface issues.

Required Skills:
• Proficiency in API development (REST, SOAP, GraphQL).
• Strong experience in Java, Python, .NET, or Node.js.
• Hands-on experience with integration platforms (MuleSoft, Boomi, Talend, etc.).
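Illustrative note: a minimal sketch of the REST integration work described above, using Python's requests library. The endpoints and field names are placeholders, not a real API.

```python
# Fetch from one REST endpoint, reshape the payload, forward to another.
import requests

SOURCE_URL = "https://api.example.com/v1/customers"   # hypothetical source
TARGET_URL = "https://internal.example.com/ingest"    # hypothetical target

resp = requests.get(
    SOURCE_URL, params={"updated_since": "2024-06-01"}, timeout=30
)
resp.raise_for_status()  # fail fast on 4xx/5xx instead of forwarding bad data

# Transform: keep only the fields the downstream system expects
records = [
    {"id": c["id"], "email": c["email"]}
    for c in resp.json().get("customers", [])
]

post = requests.post(TARGET_URL, json=records, timeout=30)
post.raise_for_status()
print(f"forwarded {len(records)} records")
```

Integration platforms such as MuleSoft or Boomi wrap this fetch-transform-forward pattern in managed connectors; the underlying flow is the same.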
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Data Catalogue Specialist
Career Level: D3

Introduction to role
The GBS Data Office (GDO) supports Commercial and Enabling Units business areas. As a Data Catalogue Specialist, you will be responsible for the systems, business rules, and processes that ensure data is findable, accessible, and fit for use by one or more business units. You will achieve this through the capture of metadata and the development of the data catalogue.

Accountabilities
Support the Data Catalogue Principal to define Information Asset Registers across business areas to help profile information risk/value.
Participate in projects to mitigate and control identified priority risk areas.
Take responsibility for nominated markets/business areas, develop domain knowledge, and leverage internal customer relationships to respond to localised use cases.
Act as point of contact for nominated business areas or markets.
Support initiatives to enhance the reusability and transparency of our data by making it available in our global data catalogue.
Support the capture of user requirements for functionality and usability, and document technical requirements.
Work with IT partners to capture metadata for relevant data sets and lineage, and populate the catalogue.
Work with data stewards and business users to enrich catalogue entries with the business data dictionary, business rules, and glossaries.
Execute monitoring controls to assure metadata quality remains at a high level.
Support catalogue principles and data governance leads in tool evaluation and UAT.

Essential Skills/Experience
Demonstrable experience of working in a data management, data governance, or data engineering domain.
Strong business and system analysis skills.
Proven experience with data catalogue, search, and automation software (Collibra, Informatica, Talend, etc.).
Ability to interpret and communicate technical information in business language and in alignment with the AZ business.
Solid understanding of metadata harvesting methodologies and the ability to create business and technical metadata sets.
Strong engagement, communication, and stakeholder management skills, including excellent organisational, presentation, and influencing skills.
High level of proficiency with common business applications (Excel, Visio, Word, PowerPoint & SAP business user).

Desirable Skills/Experience
Proven experience of working with Commercial or Finance data and systems (Veeva, Reltio, SAP) and their consumption.
Domain knowledge of life sciences/pharmaceuticals, manufacturing, corporate finance, or sales & marketing.
Experience with data quality and profiling software.
Experience of working in a complex, diverse global organisation.

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

Join an ambitious company that's on a mission to do more and be more! Help shape our journey towards becoming a tech-enabled enterprise. Encouraged to look to the future, think strategically, and innovate, we influence progress today. We are introducing new solutions and technology through our investment in automated processes.
It's by driving greater consistency and efficiency that we will help accelerate our next chapter of growth. Ready to make a difference? Apply now!
Posted 3 weeks ago
12.0 - 22.0 years
25 - 40 Lacs
Bangalore Rural, Bengaluru
Work from Office
Role & responsibilities

Requirements:
Data Modeling (Conceptual, Logical, Physical) - Minimum 5 years
Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL) - Minimum 5 years
Cloud Platforms (AWS, Azure, GCP) - Minimum 3 years
ETL Tools (Informatica, Talend, Apache NiFi) - Minimum 3 years
Big Data Technologies (Hadoop, Spark, Kafka) - Minimum 5 years
Data Governance & Compliance (GDPR, HIPAA) - Minimum 3 years
Master Data Management (MDM) - Minimum 3 years
Data Warehousing (Snowflake, Redshift, BigQuery) - Minimum 3 years
API Integration & Data Pipelines - Good to have
Performance Tuning & Optimization - Minimum 3 years
Business Intelligence (Power BI, Tableau) - Minimum 3 years

Job Description:
We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/. Experience and deep knowledge of at least one of these 3 platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights.

Key Responsibilities:
1. Data Governance & Management
Establish and maintain a data usage hierarchy to ensure structured data access.
Define data policies, standards, and governance frameworks to ensure consistency and compliance.
Implement data quality management practices to improve accuracy, completeness, and reliability.
Oversee metadata and master data management (MDM) to enable seamless data integration across platforms.
2. Data Architecture & Migration
Lead the migration of data systems from legacy infrastructure to Microsoft Fabric.
Design scalable, high-performance data architectures that support business intelligence and analytics.
Collaborate with IT and engineering teams to ensure efficient data pipeline development.
3. Advanced Analytics & Machine Learning
Identify and define use cases for advanced analytics that align with business objectives.
Design and develop machine learning models to drive data-driven decision-making.
Work with data scientists to operationalize ML models and ensure real-world applicability.

Required Qualifications:
Proven experience as a Data Architect or in a similar data management and analytics role.
Strong knowledge of data governance frameworks, data quality management, and metadata management.
Hands-on experience with Microsoft Fabric and data migration from legacy systems.
Expertise in advanced analytics, machine learning models, and AI-driven insights.
Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP).
Strong communication skills with the ability to translate complex data concepts into business insights.

Preferred candidate profile: Immediate joiner.
Posted 4 weeks ago
13.0 - 23.0 years
25 - 35 Lacs
Hyderabad
Work from Office
Role: Snowflake Practice Lead / Architect / Solution Architect
Exp: 13+ Years
Work Location: Hyderabad

Position Overview: We are seeking a highly skilled and experienced Snowflake Practice Lead to drive our data strategy, architecture, and implementation using Snowflake. This leadership role requires a deep understanding of Snowflake's cloud data platform, data engineering best practices, and enterprise data management. The ideal candidate will be responsible for defining best practices, leading a team of Snowflake professionals, and driving successful Snowflake implementations for clients.

Key Responsibilities:

Leadership & Strategy:
- Define and drive the Snowflake practice strategy, roadmap, and best practices.
- Act as the primary subject matter expert (SME) for Snowflake architecture, implementation, and optimization.
- Collaborate with stakeholders to understand business needs and align data strategies accordingly.

Technical Expertise & Solutioning:
- Design and implement scalable, high-performance data architectures using Snowflake.
- Develop best practices for data ingestion, transformation, modeling, and security within Snowflake.
- Guide clients on Snowflake migrations, ensuring a seamless transition from legacy systems.
- Optimize query performance, storage utilization, and cost efficiency in Snowflake environments.

Team Leadership & Mentorship:
- Lead and mentor a team of Snowflake developers, data engineers, and architects.
- Provide technical guidance, conduct code reviews, and establish best practices for Snowflake development.
- Train internal teams and clients on Snowflake capabilities, features, and emerging trends.

Client & Project Management:
- Engage with clients to understand business needs and design tailored Snowflake solutions.
- Lead end-to-end Snowflake implementation projects, ensuring quality and timely delivery.
- Work closely with data scientists, analysts, and business stakeholders to maximize data utilization.

Required Skills & Experience:
- 10+ years of experience in data engineering, data architecture, or cloud data platforms.
- 5+ years of hands-on experience with Snowflake in large-scale enterprise environments.
- Strong expertise in SQL, performance tuning, and cloud-based data solutions.
- Experience with ETL/ELT processes, data pipelines, and data integration tools (e.g., Talend, Matillion, dbt, Informatica).
- Proficiency in cloud platforms such as AWS, Azure, or GCP, particularly their integration with Snowflake.
- Knowledge of data security, governance, and compliance best practices.
- Strong leadership, communication, and client-facing skills.
- Experience in migrating from traditional data warehouses (Oracle, Teradata, SQL Server) to Snowflake.
- Familiarity with Python, Spark, or other big data technologies is a plus.

Preferred Qualifications:
- Snowflake SnowPro certification (e.g., SnowPro Core, Advanced Architect, Data Engineer).
- Experience in building data lakes, data marts, and real-time analytics solutions.
- Hands-on experience with DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) in Snowflake environments.

Why Join Us?
- Opportunity to lead cutting-edge Snowflake implementations in a dynamic, fast-growing environment.
- Work with top-tier clients across industries, solving complex data challenges.
- Continuous learning and growth opportunities in cloud data technologies.
- Competitive compensation, benefits, and a collaborative work culture.
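On the "optimize query performance, storage utilization, and cost efficiency" responsibility: below is a hedged sketch of the kind of diagnostic a practice lead might automate, using the snowflake-connector-python package. Account details and the table name are placeholders, not anything from the posting:

```python
# Hedged sketch only: the posting mentions Snowflake performance and cost
# tuning but prescribes no code. Connection parameters are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="my_user",         # placeholder
    password="...",         # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
)
cur = conn.cursor()

# How well-clustered is a large table? Poor clustering depth often
# explains weak partition pruning and high scan costs.
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('SALES.PUBLIC.ORDERS')")
print(cur.fetchone()[0])

# Which recent queries scanned the most data? A starting point for tuning.
cur.execute("""
    SELECT query_text, total_elapsed_time, bytes_scanned
    FROM snowflake.account_usage.query_history
    ORDER BY bytes_scanned DESC
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)
conn.close()
```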
Posted 4 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company Description: Codenia Technologies LLP specializes in delivering innovative and tailored solutions across web development, mobile applications, and AI-powered tools to help businesses thrive in the digital era. The company focuses on crafting cutting-edge websites, developing intuitive mobile apps, and harnessing AI for smarter, data-driven solutions. Codenia stands out for its quality, creativity, and efficiency in exceeding expectations and fostering long-lasting partnerships.

Role Description: This is a full-time on-site role located in Gurgaon for a Data Analyst at Codenia Technologies LLP. You need to possess the drive and ability to deliver on projects without constant supervision. Expertise is needed in Power BI, ETL, and Databricks.

Technical: This role has a heavy emphasis on thinking and working outside the box. You need to have a thirst for learning new technologies and be receptive to adopting new approaches and ways of thinking.
Logic: You need to have the ability to work through and make logical sense of complicated and often abstract solutions and processes.
Language: The Customer has a global footprint, with offices and clients around the globe. The ability to read, write, and speak fluently in English is a must. Other languages could prove useful.
Communication: Your daily job will regularly require communication with Customer team members. The ability to clearly communicate on a technical level is essential to your job. This includes both verbal and written communication.

ESSENTIAL SKILLS AND QUALIFICATIONS:
Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred).
Certifications (preferred): Microsoft Certified: Azure Data Engineer Associate; Databricks Certified Data Engineer Professional; Microsoft Certified: Power BI Data Analyst Associate.
4+ years of experience in analytics, data integration, and reporting.
4+ years of hands-on experience with Databricks, including: proficiency in Databricks Notebooks for development and testing; advanced skills in Databricks SQL, Python, and/or Scala for data engineering; expertise in cluster management, auto-scaling, and cost optimisation.
4+ years of expertise with Power BI, including: advanced DAX for building measures and calculated fields; proficiency in Power Query for data transformation; a deep understanding of Power BI architecture, workspaces, and row-level security.
Strong knowledge of SQL for querying, aggregations, and optimization.
Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
Proficiency in Azure cloud platforms and their application to analytics solutions.
Strong analytical thinking with the ability to translate data into actionable insights.
Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
Ability to manage multiple priorities in a fast-paced environment with high customer expectations.
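Since the role pairs Databricks engineering with Power BI reporting, a common pattern is to pre-aggregate data in Databricks so the Power BI model stays small and responsive. A minimal sketch, with hypothetical table names, assuming a Databricks notebook (where `spark` is provided, though the sketch also creates a session so it runs elsewhere):

```python
# Illustrative only: table and column names are invented; the posting
# specifies the stack (Databricks + Power BI) but no schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.table("sales.orders")  # hypothetical Delta table

# Pre-aggregate daily revenue per region so Power BI imports a small,
# analysis-ready table instead of raw transactions.
daily = (
    orders
    .groupBy(F.to_date("order_ts").alias("order_date"), "region")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("customer_id").alias("customers"),
    )
)

# Persist as a table Power BI can query through the Databricks connector.
daily.write.mode("overwrite").saveAsTable("gold.daily_revenue")
```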
Posted 4 weeks ago
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
The HiLabs Story
HiLabs is a leading provider of AI-powered solutions to clean dirty data, unlocking its hidden potential for healthcare transformation. HiLabs is committed to transforming the healthcare industry through innovation, collaboration, and a relentless focus on improving patient outcomes.

HiLabs Team
Multidisciplinary industry leaders, healthcare domain experts, and AI/ML and data science experts: professionals hailing from the world's best universities, business schools, and engineering institutes, including Harvard, Yale, Carnegie Mellon, Duke, Georgia Tech, the Indian Institutes of Management (IIM), and the Indian Institutes of Technology (IIT). Be a part of a team that harnesses advanced AI, ML, and big data technologies to develop a cutting-edge healthcare technology platform, delivering innovative business solutions.

Job Title: Data Engineer II / Sr Data Engineer
Job Location: Bangalore, Karnataka, India

Job summary: We are a leading Software as a Service (SaaS) company that specializes in the transformation of data in the US healthcare industry through cutting-edge Artificial Intelligence (AI) solutions. We are looking for software developers who continually strive to advance engineering excellence and technology innovation. The mission is to power the next generation of digital products and services through innovation, collaboration, and transparency. You will be a technology leader and doer who enjoys working in a dynamic, fast-paced environment.

Responsibilities:
Design, develop, and maintain robust and scalable ETL/ELT pipelines to ingest and transform large datasets from various sources.
Optimize and manage databases (SQL/NoSQL) to ensure efficient data storage, retrieval, and manipulation for both structured and unstructured data.
Collaborate with data scientists, analysts, and engineers to integrate data from disparate sources and ensure smooth data flow between systems.
Implement and maintain data validation and monitoring processes to ensure data accuracy, consistency, and availability.
Automate repetitive data engineering tasks and optimize data workflows for performance and scalability.
Work closely with cross-functional teams to understand their data needs and provide solutions that help scale operations.
Ensure proper documentation of data engineering processes, workflows, and infrastructure for easy maintenance and scalability.

Desired Profile:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
3-5 years of hands-on experience as a Data Engineer or in a related data-driven role.
Strong experience with ETL tools like Apache Airflow, Talend, or Informatica.
Expertise in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra).
Strong proficiency in Python, Scala, or Java for data manipulation and pipeline development.
Experience with cloud-based platforms (AWS, Google Cloud, Azure) and their data services (e.g., S3, Redshift, BigQuery).
Familiarity with big data processing frameworks such as Hadoop, Spark, or Flink.
Experience with data warehousing concepts and building data models (e.g., Snowflake, Redshift).
Understanding of data governance, data security best practices, and data privacy regulations (e.g., GDPR, HIPAA).
Familiarity with version control systems like Git.

HiLabs is an equal opportunity employer (EOE).
No job applicant or employee shall receive less favorable treatment or be disadvantaged because of their gender, marital or family status, color, race, ethnic origin, religion, disability, or age; nor be subject to less favorable treatment or be disadvantaged on any other basis prohibited by applicable law. HiLabs is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce to support individual growth and superior business results. Thank you for reviewing this opportunity with HiLabs! If this position appears to be a good fit for your skillset, we welcome your application. HiLabs Total Rewards Competitive Salary, Accelerated Incentive Policies, H1B sponsorship, Comprehensive benefits package that includes ESOPs, financial contribution for your ongoing professional and personal development, medical coverage for you and your loved ones, 401k, PTOs & a collaborative working environment, Smart mentorship, and highly qualified multidisciplinary, incredibly talented professionals from highly renowned and accredited medical schools, business schools, and engineering institutes. CCPA disclosure notice - https://www.hilabs.com/privacy
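For the Airflow experience the profile above asks for, a generic daily pipeline skeleton looks like the following. This is an illustrative sketch, not HiLabs code; the DAG and task names are invented and the task bodies are stubs:

```python
# Minimal Airflow 2.4+ DAG sketch: extract -> validate -> load, daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pull source data")          # stub: replace with real extraction

def validate(**context):
    print("run data validation checks")  # stub: accuracy/consistency checks

def load(**context):
    print("load into the warehouse")   # stub: replace with real load step

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="validate", python_callable=validate)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```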
Posted 4 weeks ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Expert - Cloud Architecture - IN
The role is responsible for end-to-end solutioning, which includes the front-end platforms, servers, storage, delivery, and networks required to manage cloud storage.

Key Responsibilities and Duties: They are responsible for overseeing and consulting on a company's cloud computing strategy, which includes cloud adoption plans, cloud application design, cloud management and monitoring, cloud suitability assessment, and a near-term migration plan from the client's existing application/infrastructure platform to a cloud-based solution. Expert knowledge and understanding of Salesforce.com, Microsoft Office 365, Windows Azure, Google App Engine, and Amazon Web Services is desired.

Educational Requirements: University (Degree) Preferred
Work Experience: 5+ Years Required; 7+ Years Preferred
Physical Requirements: Sedentary Work
Career Level: 9IC

Enterprise/Solution Architect - Client and Marketing Services
Enterprise architect supporting our Client Services and Market Technology Teams. The role will work with our business, product, and engineering teams to identify and understand requirements and put in place a formal structure for solution designs, ensuring our solutions meet day-to-day and future enterprise needs. The role will work alongside our technology leads to implement solutions that align to our common enterprise goals, sitting across an architectural domain and supporting a portfolio of products as part of our overall Nuveen Technology strategy for Client Services. You will support our Client and Marketing Services technology stack as we mature our capabilities and better leverage our modern tools. We are increasingly focused on ensuring our clients are at the center of what we do, better understanding their needs, and taking a digital approach to how we interact with them. We are looking for someone to join our dynamic team as a senior architect supporting our journey. The role will be part of the wider Nuveen Architecture team that supports across our Nuveen Technology stack.

Ideal Candidate:
Leads the design of complex architecture requirements, from requirements analysis through to solution.
Manages the design process, inclusive of functions, capabilities, and performance, understanding both the technology and compliance requirements of the enterprise.
Experience working with product and engineering teams through an agile and iterative approach to design and delivery.
Has worked as part of and with large global teams across multiple markets.
Delivered projects where solutions are both off-the-shelf and where custom applications have been built out.
Implements strategies and activities to deliver solutions requiring integration of multiple platforms, operating systems, and applications across the enterprise.
Liaises with senior leaders and can explain the design of the technical architecture clearly to other leaders within the organization.
Able to demonstrate experience and understanding of modern architectural patterns and design practice.
Enjoys working in a dynamic work environment where innovation is at the center of how we work.

Required Skills:
5+ years as an Enterprise Architect focused on designing and integrating enterprise-class solutions, both custom-built and off the shelf.
A background as an engineer, primarily experienced with client and marketing platforms: Salesforce (Sales Cloud, Service Cloud, Marketing Cloud), SalesPage, Adobe Experience Manager, Sitecore, Snowflake, Talend, Demandbase.
Experience working with the Asset Management or Financial Services industry and across various lines of business.
University Degree.

Preferred Skills:
Currently a practicing enterprise architect.
A practicing user of AWS services, across both the application and data stack, having completed and deployed projects within the past 12 months.
Experience designing end-to-end solutions.
Experience working with senior stakeholders.
Specific experience of the middle-office technology stack and best-practice implementations across the asset management industry.

Related Skills: Application Programming Interface (API) Development/Integration, Automation, Communication, Consultative Communication, Containerization, DevOps, Enterprise Application Integration, Influence, Organizational Savviness, Problem Solving, Prototyping, Relationship Management, Scalability/Reliability, Software Development Life Cycle, Systems Design/Analysis

Company Overview: TIAA Global Capabilities was established in 2016 with a mission to tap into a vast pool of talent, reduce risk by insourcing key platforms and processes, and contribute to innovation with a focus on enhancing our technology stack. TIAA Global Capabilities is focused on building a scalable and sustainable organization, with a focus on technology, operations, and expanding into the shared-services business space. Working closely with our U.S. colleagues and other partners, our goal is to reduce risk, improve the efficiency of our technology and processes, and develop innovative ideas to increase throughput and productivity.

We are an Equal Opportunity Employer. TIAA does not discriminate against any candidate or employee on the basis of age, race, color, national origin, sex, religion, veteran status, disability, sexual orientation, gender identity, or any other legally protected status.

Accessibility Support: TIAA offers support for those who need assistance with our online application process to provide an equal employment opportunity to all job seekers, including individuals with disabilities. If you are a U.S. applicant and desire a reasonable accommodation to complete a job application, please use one of the below options to contact our Accessibility Support Team:
Phone: (800) 842-2755
Email: accessibility.support@tiaa.org

Privacy Notices:
For Applicants of TIAA, Nuveen and Affiliates residing in US (other than California), click here.
For Applicants of TIAA, Nuveen and Affiliates residing in California, please click here.
For Applicants of TIAA Global Capabilities, click here.
For Applicants of Nuveen residing in Europe and APAC, please click here.
Posted 4 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: Data Pipeline Developer (L1 Data Ops Analyst)

Role Description: This role involves developing and maintaining data pipelines for ingesting, wrangling, transforming, and joining data from multiple sources. The candidate will work independently to deliver high-quality code and ensure seamless operation of data systems. This position requires proficiency in ETL tools such as Informatica, Glue, Databricks, and DataProc, along with strong coding skills in Python, PySpark, and SQL. The candidate should be capable of operating with minimal supervision while adhering to established processes and guidelines.

Responsibilities:
Independently develop, test, and maintain data processing pipelines, ensuring performance and scalability.
Monitor dashboards and databases to ensure smooth operation during a 9-hour shift.
Identify, troubleshoot, and escalate system or data anomalies based on predefined thresholds.
Follow Standard Operating Procedures (SOPs) and runbooks to resolve basic data issues.
Collaborate with L2 and L3 support teams for issue escalation and resolution.
Write and execute advanced SQL queries for data validation, extraction, and troubleshooting.
Document all development processes, including code, testing procedures, and operational logs.
Ensure adherence to engineering standards, timelines, and Service Level Agreements (SLAs).
Conduct unit tests to maintain data integrity and quality.
Estimate time, effort, and resource requirements for assigned tasks.
Participate in knowledge-sharing activities and contribute to project documentation.
Ensure compliance with release management processes for seamless deployments.
Obtain and maintain relevant technical and domain certifications (e.g., Azure, AWS, GCP).

Mandatory Skills:
Proficiency in Python, PySpark, and SQL for data manipulation and transformation.
Hands-on experience with ETL tools (e.g., Informatica, AWS Glue, Databricks, DataProc).
Strong knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud) and data-related services.
Ability to conduct tests on data pipelines and evaluate results against data quality standards.
Familiarity with advanced SQL functions, including windowing and analytical functions.
Strong analytical and problem-solving skills for diagnosing data-related issues.

Good-to-Have Skills:
Experience with Apache Airflow, Talend, or other ETL orchestration tools.
Knowledge of data schemas, models, and data governance practices.
Understanding of low-level design (LLD) and the ability to map it to user requirements.
Experience with MS Excel for data analysis and reporting.

Soft Skills:
Strong written and verbal communication skills in English.
Ability to work independently with minimal supervision.
High attention to detail and ability to manage multiple monitoring tasks.
Adaptability to work in a 24x7 shift schedule, including night shifts.
Effective collaboration and communication with cross-functional teams and stakeholders.

Experience Range: 2 to 5 years of experience in data pipeline development and data operations.

Hiring Locations: Chennai, Trivandrum, Kochi

Measures of Success:
Adherence to engineering standards, schedules, and SLAs.
Reduction of recurring defects and quick resolution of production bugs.
Accuracy and completeness of project documentation.
Successful completion of required technical and domain certifications.
Effective communication and issue resolution within support teams.
Consistent delivery of optimized and error-free code.
Skills: SQL, Data Analysis, MS Excel, Dashboards
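To make the ingest-wrangle-join-validate loop above concrete, here is a minimal PySpark sketch with a SQL validation step of the kind the responsibilities describe. Paths, column names, and the failure condition are hypothetical:

```python
# Illustrative sketch only; sources and schema are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

# Ingest two hypothetical sources.
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")
customers = spark.read.parquet("/data/raw/customers/")  # customer_id, segment

# Wrangle: normalize types and drop rows that cannot be processed.
orders = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_id").isNotNull())
)

# Join, then validate with SQL before publishing; a null `segment` means
# the customer lookup failed, so escalate, mirroring the "escalate
# anomalies against predefined thresholds" duty.
enriched = orders.join(customers, on="customer_id", how="left")
enriched.createOrReplaceTempView("enriched")
unmatched = spark.sql(
    "SELECT count(*) AS n FROM enriched WHERE segment IS NULL"
).first()["n"]
if unmatched > 0:
    raise ValueError(f"{unmatched} orders failed the customer lookup")

enriched.write.mode("overwrite").parquet("/data/curated/orders_enriched/")
```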
Posted 4 weeks ago
6 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Position: ETL Support Engineer
Experience: 6+ Years
Location: Bengaluru
Work Mode: Remote
Employment Type: Contract
Notice Period: Immediate to 15 days
Must-Have Skills: Banking Domain, Git, Snowflake, ETL, JIRA/ServiceNow, SQL

Job Responsibilities:
· Understanding of the ETL process.
· Perform functional, integration, and regression testing for ETL processes.
· Validate and ensure data quality and consistency across different data sources and targets.
· Develop and execute test cases for ETL workflows and data pipelines.
· Load testing: ensuring that the data warehouse can handle the volume of data being loaded and queried under normal and peak conditions.
· Scalability: testing for the scalability of the data warehouse in terms of data growth and system performance.

External Skills and Expertise
Required Qualification:
· Bachelor's degree in Computer Science, Engineering, Mathematics, or a related discipline, or its foreign equivalent.
· 6+ years of experience with ETL support and development.
· ETL tools: experience with popular ETL tools like Talend and Microsoft SSIS.
· Experience with relational databases (e.g., SQL Server, Postgres).
· Experience with the Snowflake data warehouse.
· Proficiency in writing complex SQL queries for data validation, comparison, and manipulation.
· Familiarity with version control systems like Git/GitHub to manage changes in test cases and scripts.
· Knowledge of defect-tracking tools like JIRA and ServiceNow.
· Banking domain experience is a must.
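A typical building block for the ETL testing described above is a source-to-target reconciliation check. A hedged sketch follows, using psycopg2 for the relational source and the Snowflake Python connector for the target; the DSNs and table names are invented, and the posting itself specifies no code:

```python
# Illustrative reconciliation check: compare row counts between a staging
# database and the Snowflake target after a load. Credentials and table
# names are placeholders.
import psycopg2
import snowflake.connector

def count_rows(cursor, table: str) -> int:
    # Table names are fixed constants here; never interpolate user input.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

src = psycopg2.connect("dbname=staging user=etl")  # placeholder DSN
tgt = snowflake.connector.connect(
    account="my_account", user="etl", password="..."  # placeholders
)

src_n = count_rows(src.cursor(), "public.transactions")
tgt_n = count_rows(tgt.cursor(), "DW.PUBLIC.TRANSACTIONS")

assert src_n == tgt_n, f"row-count mismatch: source={src_n}, target={tgt_n}"
print(f"reconciled {src_n} rows")
```

In practice this check is extended with column-level checksums or sampled value comparisons, since matching row counts alone do not prove the data arrived intact.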
Posted 4 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
Extensive experience in designing and maintaining data architectures and data processing pipelines at large scale. Familiarity with databases and ETL processes, along with data warehousing concepts. Proven experience in designing and developing data pipelines using technologies like Apache Spark or Apache Kafka, or ETL tools like Talend/Informatica or similar frameworks. Strong understanding of relational and NoSQL databases, data modelling, and ETL processes.

Responsibilities:
Assist in designing, developing, and maintaining data pipelines.
Collaborate with senior team members to understand data requirements and contribute to the implementation of data solutions.
Participate in troubleshooting and resolving data-related issues.
Monitor dashboards and respond to alerts.
Lead the design and implementation of complex and scalable data pipelines.
Optimize and tune existing data platforms for enhanced performance and scalability.

Requirements:
Proficiency in programming languages like Python, PySpark, and SQL.
Experience with cloud platforms, preferably AWS, and training in AWS technologies.
Experience with ETL tools like Talend or Informatica.
Ability to work in a fast-paced environment.
Strong business acumen, including written and verbal communication skills.
Strong interpersonal and organizational skills.
Excellent verbal and non-verbal communication skills, including the ability to communicate with people at all levels.
Requires a Bachelor's Degree in a technical or equivalent discipline.

This is a summary of the primary accountabilities and requirements for this position. The company reserves the right to modify or amend accountabilities and requirements at any time at its sole discretion based on business needs. Any part of this job description is subject to possible modification to reasonably accommodate individuals with disabilities.

About Us: Our story
Mouser Electronics, founded in 1964, is a globally authorized distributor of semiconductors and electronic components for over 1,200 industry-leading manufacturer brands. This year marks the company's 60th anniversary. We specialize in the rapid introduction of the newest products and technologies targeting the design engineer and buyer communities. Mouser has 28 offices located around the globe. We conduct business in 23 different languages and 34 currencies. Our global distribution centre is equipped with state-of-the-art wireless warehouse management systems that enable us to process orders 24/7 and deliver nearly perfect pick-and-ship operations.
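For the Spark/Kafka pipeline experience the description calls for, a minimal structured-streaming sketch is shown below. It assumes the spark-sql-kafka connector is on the classpath; the broker address, topic, and storage paths are placeholders:

```python
# Illustrative only: land a Kafka event stream to storage with
# checkpointing so the job can recover without data loss.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Read the stream; Kafka delivers keys/values as bytes.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
    .option("subscribe", "orders")                     # placeholder topic
    .load()
    .select(F.col("value").cast("string").alias("payload"))
)

# Write raw payloads to storage; the checkpoint tracks consumed offsets.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/raw/orders/")
    .option("checkpointLocation", "/chk/orders/")
    .start()
)
query.awaitTermination()
```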
Posted 4 weeks ago
Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.
The major Indian IT hubs have a high concentration of IT companies and organizations that frequently hire for Talend roles.
The average salary range for Talend professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
A typical career progression in the field of Talend may follow this path:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.
In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:
- Data Warehousing
- ETL (Extract, Transform, Load) processes
- SQL
- Big Data technologies (e.g., Hadoop, Spark)
As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!