10.0 - 15.0 years
15 - 25 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Experience: 10+ Years

Job Description:

Role Overview: We are seeking an experienced AWS Data & Analytics Architect with a strong background in delivery and excellent communication skills. The ideal candidate will have over 10 years of experience and a proven track record in managing teams and client relationships. You will be responsible for leading data modernization and transformation projects using AWS services.

Key Responsibilities:
- Lead and architect data modernization/transformation projects using AWS services.
- Manage and mentor a team of data engineers and analysts.
- Build and maintain strong client relationships, ensuring successful project delivery.
- Design and implement scalable data architectures and solutions.
- Oversee the migration of large datasets to AWS, ensuring data integrity and security.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Ensure best practices in data management and governance are followed.

Required Skills and Experience:
- 10+ years of experience in data architecture and analytics.
- Hands-on experience with AWS services such as Redshift, S3, Glue, Lambda, RDS, and others.
- Proven experience in delivering 1-2 large data migration/modernization projects using AWS.
- Strong leadership and team management skills.
- Excellent communication and interpersonal skills.
- Deep understanding of data modeling, ETL processes, and data warehousing.
- Experience with data governance and security best practices.
- Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- AWS Certified Solutions Architect Professional or AWS Certified Big Data Specialty.
- Experience with other cloud platforms (e.g., Azure, GCP) is a plus.
- Familiarity with machine learning and AI technologies.
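As one concrete touchpoint for the AWS services listed, here is a hedged boto3 sketch that kicks off and polls a Glue ETL job; the job name, region, and arguments are placeholders, and AWS credentials are assumed to be configured in the environment:

```python
# Hedged sketch: triggering an AWS Glue ETL job with boto3 and polling its
# state. Job name, region, and arguments are placeholders.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(JobName="orders-modernization",
                         Arguments={"--load_date": "2024-01-31"})
run_id = run["JobRunId"]

while True:
    status = glue.get_job_run(JobName="orders-modernization", RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED"):
        print(run_id, state)
        break
    time.sleep(30)  # poll every 30 seconds until the run finishes
```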
Posted 17 hours ago
3.0 - 5.0 years
3 - 5 Lacs
Mumbai, Maharashtra, India
On-site
Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.
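As a rough illustration of that kind of PySpark ETL workflow (a hedged sketch only: the storage paths, container names, and columns are placeholders, and Azure storage credentials are assumed to be configured on the cluster):

```python
# Minimal extract-transform-load sketch on the Azure Data Stack.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV landed in ADLS Gen2 (path is a placeholder)
raw = (spark.read.option("header", "true")
       .csv("abfss://raw@examplestorage.dfs.core.windows.net/orders/"))

# Transform: type casting, filtering, and a derived column
orders = (raw
          .withColumn("amount", F.col("amount").cast("double"))
          .filter(F.col("amount") > 0)
          .withColumn("order_date", F.to_date("order_ts")))

# Load: write curated data as partitioned Parquet
(orders.write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("abfss://curated@examplestorage.dfs.core.windows.net/orders/"))
```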
Posted 1 day ago
4.0 - 9.0 years
4 - 9 Lacs
Noida, Uttar Pradesh, India
On-site
Job Summary: Alike Thoughts Info Systems is seeking a skilled Informatica Product 360 Developer to support the design, development, and maintenance of our Product Information Management (PIM) solutions. This role will focus on leveraging Informatica Product 360 (P360) to ensure high-quality, consistent, and accurate product data across our systems. Key Responsibilities: Develop and configure solutions using Informatica Product 360 (P360) to facilitate efficient product onboarding, data enrichment, and syndication processes. Customize workflows, implement business rules, design user interfaces, and establish data validations within the P360 platform. Integrate Informatica P360 with various enterprise systems such as ERPs, e-commerce platforms, and external data sources, utilizing methods like APIs, ETL processes, and flat files. Collaborate closely with data stewards and business users to thoroughly understand product data requirements and translate them into robust technical designs. Ensure the delivery of high-quality, consistent, and accurate product data across all integrated systems.
Posted 1 day ago
5.0 - 10.0 years
5 - 10 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities: Design and develop visually appealing, interactive Power BI dashboards and reports based on business requirements. Connect to data sources, import data, and transform it using DAX. Collaborate with stakeholders to gather reporting requirements and translate them into technical solutions. Develop and maintain data models, including measures, calculated columns, and hierarchies. Optimize Power BI solutions for performance and usability. Implement row-level security and access control within Power BI. Required Qualifications: 5+ years of experience developing reports and dashboards with Power BI. Proficient in DAX (Data Analysis Expressions), Advanced Excel, VBA macros, PowerPoint presentations, and Power Query (M language). Strong understanding of data modeling, ETL processes, and reporting best practices. Experience working with various data sources (SQL Server, Excel, SharePoint, APIs, etc.).
Posted 1 day ago
5.0 - 8.0 years
12 - 20 Lacs
Thane, Maharashtra, India
On-site
Description We are seeking a skilled Data Mapping professional to join our team in India. The ideal candidate will be responsible for conducting data mapping activities, ensuring accurate data integration and transformation across various systems. This role requires a strong analytical mindset and the ability to collaborate effectively with cross-functional teams. Responsibilities Conduct data mapping activities to ensure accurate data integration and transformation across systems. Analyze data sources and create detailed mapping specifications for ETL processes. Collaborate with cross-functional teams to understand data requirements and provide solutions. Document data mapping processes and maintain mapping documentation for future reference. Validate and test data mappings to ensure data quality and integrity throughout the data lifecycle. Skills and Qualifications 5-8 years of experience in data mapping, data analysis, or related field. Proficiency in SQL and experience with database management systems. Familiarity with data integration tools such as Informatica, Talend, or Microsoft SSIS. Strong understanding of data modeling concepts and best practices. Experience with ETL processes and data warehousing solutions. Knowledge of data governance and data quality principles. Excellent analytical and problem-solving skills. Strong communication and collaboration skills to work with diverse teams.
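The mapping-specification idea described here can be made concrete in code; below is a small, hypothetical Python sketch in which the spec is kept as plain data and applied by a generic transform (all field names are invented for illustration):

```python
# Illustrative only: a source-to-target mapping spec applied as a transform.
import pandas as pd

# source column -> (target column, target dtype); hypothetical fields
MAPPING = {
    "cust_nm": ("customer_name", "string"),
    "dob":     ("date_of_birth", "datetime"),
    "bal_amt": ("balance",       "float64"),
}

def apply_mapping(df: pd.DataFrame) -> pd.DataFrame:
    # Select and rename source columns per the spec, then cast types
    out = df[list(MAPPING)].rename(columns={s: t for s, (t, _) in MAPPING.items()})
    for _src, (tgt, dtype) in MAPPING.items():
        out[tgt] = pd.to_datetime(out[tgt]) if dtype == "datetime" else out[tgt].astype(dtype)
    return out

source = pd.DataFrame({"cust_nm": ["A. Rao"], "dob": ["1990-01-01"], "bal_amt": ["1250.50"]})
print(apply_mapping(source).dtypes)
```

Keeping the spec as data means the mapping documentation and the executable transform stay in sync, which supports the validation and documentation duties above.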
Posted 1 day ago
5.0 - 8.0 years
13 - 20 Lacs
Ahmedabad, Gujarat, India
On-site
Description We are seeking an experienced Data Operator to join our team in India. The successful candidate will be responsible for managing and maintaining data, ensuring its accuracy and availability for various business needs. This role requires strong attention to detail and a commitment to maintaining high standards in data management. Responsibilities Enter and maintain data in databases and systems accurately and efficiently. Verify and validate data for accuracy and completeness. Generate reports and summaries based on data analysis. Assist in data cleaning and preparation for analysis. Collaborate with other departments to ensure data integrity and consistency. Skills and Qualifications Proficient in Microsoft Excel and data entry software. Strong attention to detail and accuracy. Good analytical and problem-solving skills. Familiarity with database management systems (e.g., SQL, Oracle). Basic knowledge of data visualization tools (e.g., Tableau, Power BI) is a plus. Excellent communication skills, both verbal and written. Ability to work independently and as part of a team.
Posted 1 day ago
4.0 - 10.0 years
3 - 11 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities:
- Lead the design, development, and implementation of ETL processes using Azure Data Factory (ADF).
- Develop and maintain complex SQL queries and perform SQL tuning.
- Create and manage interactive dashboards and reports using Power BI.
- Collaborate with stakeholders to gather requirements and translate them into technical solutions.
- Provide mentorship and guidance to junior developers.
- Ensure data quality and integrity throughout the ETL process.
- Participate in code reviews and ensure adherence to best practices.

Requirements:
- Experience: 4-10 years of IT experience with a strong background in ETL processes and data warehousing.
- Skills: Proficient in Azure Data Factory (ADF). Strong hands-on experience with Power BI. Advanced SQL skills with the ability to handle complex queries and SQL tuning.
- Domain Knowledge: Preferably experienced in the insurance domain.
- Soft Skills: Strong problem-solving skills, excellent communication abilities, and the capability to work collaboratively in a team environment.
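Operationalizing ADF pipelines like these often involves scripting runs; here is a hedged sketch using the azure-identity and azure-mgmt-datafactory packages, where the subscription, resource group, factory, pipeline, and parameter names are all placeholders:

```python
# Hedged sketch: trigger an ADF pipeline run and check its status.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(),
                                     "<subscription-id>")

# Kick off a run with a date parameter (all names are placeholders)
run = client.pipelines.create_run(
    resource_group_name="rg-analytics",
    factory_name="adf-insurance-dw",
    pipeline_name="pl_load_claims",
    parameters={"load_date": "2024-01-31"},
)

# Fetch the run's current status by its run id
status = client.pipeline_runs.get("rg-analytics", "adf-insurance-dw",
                                  run.run_id).status
print(run.run_id, status)
```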
Posted 1 day ago
4.0 - 10.0 years
3 - 15 Lacs
Hyderabad, Telangana, India
On-site
ADF and Power BI

Key Responsibilities:
- Lead the design, development, and implementation of ETL processes using Azure Data Factory (ADF).
- Develop and maintain complex SQL queries and perform SQL tuning.
- Create and manage interactive dashboards and reports using Power BI.
- Collaborate with stakeholders to gather requirements and translate them into technical solutions.
- Provide mentorship and guidance to junior developers.
- Ensure data quality and integrity throughout the ETL process.
- Participate in code reviews and ensure adherence to best practices.

Requirements:
- Experience: 4-10 years of IT experience with a strong background in ETL processes and data warehousing.
- Skills: Proficient in Azure Data Factory (ADF). Strong hands-on experience with Power BI. Advanced SQL skills with the ability to handle complex queries and SQL tuning.
- Domain Knowledge: Preferably experienced in the insurance domain.
- Soft Skills: Strong problem-solving skills, excellent communication abilities, and the capability to work collaboratively in a team environment.
Posted 1 day ago
4.0 - 10.0 years
2 - 13 Lacs
Delhi, India
On-site
ADF and Power BI

Key Responsibilities:
- Lead the design, development, and implementation of ETL processes using Azure Data Factory (ADF).
- Develop and maintain complex SQL queries and perform SQL tuning.
- Create and manage interactive dashboards and reports using Power BI.
- Collaborate with stakeholders to gather requirements and translate them into technical solutions.
- Provide mentorship and guidance to junior developers.
- Ensure data quality and integrity throughout the ETL process.
- Participate in code reviews and ensure adherence to best practices.

Requirements:
- Experience: 4-10 years of IT experience with a strong background in ETL processes and data warehousing.
- Skills: Proficient in Azure Data Factory (ADF). Strong hands-on experience with Power BI. Advanced SQL skills with the ability to handle complex queries and SQL tuning.
- Domain Knowledge: Preferably experienced in the insurance domain.
- Soft Skills: Strong problem-solving skills, excellent communication abilities, and the capability to work collaboratively in a team environment.
Posted 1 day ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Telangana, India
On-site
Job description Job Summary: We are looking for a Big Data Developer with expertise in Apache Spark and the Hadoop ecosystem. Required Skills: 5+ years of experience in big data technologies. Strong programming skills in Java (Scala/Python is a plus). Hands-on experience with Spark and Hadoop (HDFS, Hive, HBase, YARN). Understanding of Spark performance tuning and optimization techniques. Experience with real-time data streaming (Spark Streaming, Kafka). Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Role: IT & Information Security - Other Industry Type: IT Services & Consulting Department: IT & Information Security Employment Type: Full Time, Permanent Role Category: IT & Information Security - Other Education UG: Diploma in Any Specialization, B.Tech/B.E. in Any Specialization PG: MBA/PGDM in Any Specialization, MCA in Any Specialization, PG Diploma in Any Specialization, M.Tech in Any Specialization
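For the real-time streaming skill named above, a minimal Spark Structured Streaming sketch reading from Kafka (broker, topic, and paths are placeholders; the spark-sql-kafka connector is assumed on the classpath):

```python
# Minimal Kafka -> Parquet streaming pipeline with Spark Structured Streaming.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Subscribe to a Kafka topic and decode message values as strings
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload"))

# Append records to Parquet; the checkpoint gives exactly-once file output
query = (events.writeStream.format("parquet")
         .option("path", "/data/events/")
         .option("checkpointLocation", "/checkpoints/events/")
         .outputMode("append")
         .start())
query.awaitTermination()
```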
Posted 1 day ago
7.0 - 12.0 years
15 - 22 Lacs
Chennai
Hybrid
Key Responsibilities: Migrate and optimize existing data pipelines from Snowflake to Databricks. Develop and maintain efficient ETL workflows using PySpark and SQL. Design scalable and performance-optimized data processing solutions in Databricks. Troubleshoot and resolve data pipeline issues, ensuring accuracy and reliability. Work independently to analyze requirements, propose solutions, and implement them effectively. Collaborate with stakeholders to understand business requirements and ensure a seamless transition. Perform Spark performance tuning for large-scale data processing. Maintain proper documentation for migrated pipelines and workflows. Required Qualifications: Proficiency in Python, SQL, and Apache Spark (PySpark preferred). Experience with Databricks and Snowflake, including pipeline development and optimization. Strong understanding of ETL processes, data modeling, and distributed computing. Ability to work independently and manage multiple tasks in a fast-paced environment. Hands-on experience with orchestration tools (e.g., Airflow, Databricks Workflows). Cloud platform experience (AWS) is mandatory.
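One step of such a Snowflake-to-Databricks migration might look like the following hedged sketch, using the Spark-Snowflake connector available on Databricks; the connection options, secret scope, and table names are placeholders, and `spark`/`dbutils` are the objects a Databricks notebook provides:

```python
# Hedged sketch: copy one Snowflake table into a Delta table on Databricks.
sf_options = {
    "sfURL": "account.snowflakecomputing.com",      # placeholder account URL
    "sfUser": "etl_user",
    "sfPassword": dbutils.secrets.get("etl", "sf_password"),  # Databricks secret
    "sfDatabase": "SALES",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "COMPUTE_WH",
}

# Read the source table through the Spark-Snowflake connector
df = (spark.read.format("snowflake")
      .options(**sf_options)
      .option("dbtable", "ORDERS")
      .load())

# Rewrite it as a managed Delta table for downstream pipelines
(df.write.format("delta")
 .mode("overwrite")
 .saveAsTable("bronze.orders"))
```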
Posted 1 day ago
5.0 - 10.0 years
8 - 15 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
HANA Modeling: Design and develop HANA data models, including attribute views, analytic views, and calculation views. Performance Optimization: Analyze and optimize HANA database performance, identifying bottlenecks and implementing solutions. Data Integration: Work with various data sources to integrate data into SAP HANA, ensuring data quality and consistency. SQL Scripting: Develop and maintain SQL scripts for data manipulation and reporting. ETL Processes: Design and implement ETL processes to load data into SAP HANA from various sources. Collaboration: Work closely with business analysts and other stakeholders to understand requirements and translate them into technical solutions. Documentation: Document technical designs, coding standards, and maintenance procedures. Support: Provide technical support and troubleshooting for SAP HANA-related issues. Skills: SAP HANA, HANA Modeling, Performance Optimization, Data Integration, SQL Scripting, ETL Processes, SAP NetWeaver, ABAP, ABAP Reporting, LSMW, Migration Tools, BRF+, BADI Implementations, FPM, Interface Controller, Object-Oriented ABAP Programming, SAP Workflow Management, MDG Implementation Projects, Enhancing Data Modeling, Enhancing Process Modeling, Enhancing UI Modeling, Enhancing Replication, Custom Reports, Classes and Methods, IDoc/SOA/Web Services. Location: Remote
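For context, a calculation view built as described is typically consumed over SQL; a hedged sketch with SAP's hdbcli Python driver follows (host, credentials, and the view name are placeholders; classic calculation views are exposed under the _SYS_BIC schema):

```python
# Hedged sketch: query a HANA calculation view via the hdbcli driver.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="REPORT_USER", password="<secret>")
cur = conn.cursor()

# Aggregate over a (placeholder) calculation view under _SYS_BIC
cur.execute('SELECT REGION, SUM(REVENUE) FROM "_SYS_BIC"."sales/CV_SALES" '
            'GROUP BY REGION')
for region, revenue in cur.fetchall():
    print(region, revenue)

conn.close()
```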
Posted 1 day ago
8.0 - 13.0 years
3 - 18 Lacs
Hyderabad, Telangana, India
On-site
Your specific responsibilities will include:
- Design and implementation of last-mile data products using the most up-to-date technologies and software/data/DevOps engineering practices
- Enable data science & analytics teams to drive data modeling and feature engineering activities aligned with business questions and utilizing datasets in an optimal way
- Develop deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for
- Build data products based on automated data models, aligned with use case requirements, and advise data scientists, analysts, and visualization developers on how to use these data models
- Develop analytical data products for reusability, governance, and compliance by design
- Align with organization strategy and implement a semantic layer for analytics data products
- Support data stewards and other engineers in maintaining data catalogs, data quality measures, and governance frameworks

Education: B.Tech/B.S., M.Tech/M.S., or PhD in Engineering, Computer Science, Pharmaceuticals, Healthcare, Data Science, Business, or a related field

Required experience:
- 8+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience in analyzing, modeling, and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets)
- High proficiency in SQL, Python, and AWS
- Experience creating/adopting data models to meet requirements from Marketing, Data Science, and Visualization stakeholders
- Experience with feature engineering
- Experience with cloud-based (AWS/GCP/Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.)
- Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot, and low-code tools (e.g., Dataiku)
- Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders
- Experience in analytics use cases of pharmaceutical products and vaccines
- Experience in market analytics and related use cases

Preferred experience:
- Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines
- Experience with Agile ways of working, leading or working as part of scrum teams
- Certifications in AWS and/or modern data technologies
- Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors
- Experience in building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers, and business stakeholders
- Experience with data visualization technologies (e.g., Power BI)

Required Skills: Business Intelligence (BI), Data Management, Data Modeling, Data Visualization, Measurement Analysis, Stakeholder Relationship Management, Waterfall Model
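As a small, hypothetical illustration of the feature-engineering work mentioned above (column names and data invented), deriving per-account features from a transactional real-world dataset with pandas:

```python
# Illustrative feature engineering: roll transactions up to account level.
import pandas as pd

rx = pd.DataFrame({
    "account_id": [1, 1, 2],
    "fill_date": pd.to_datetime(["2024-01-05", "2024-02-07", "2024-01-20"]),
    "units": [30, 30, 90],
})

# Named aggregations produce one feature row per account
features = (rx.groupby("account_id")
            .agg(total_units=("units", "sum"),
                 n_fills=("fill_date", "count"),
                 last_fill=("fill_date", "max"))
            .reset_index())
print(features)
```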
Posted 2 days ago
5.0 - 8.0 years
3 - 18 Lacs
Pune, Maharashtra, India
On-site
Your specific responsibilities will include:
- Hands-on development of last-mile data products using the most up-to-date technologies and software/data/DevOps engineering practices
- Enable data science & analytics teams to drive data modeling and feature engineering activities aligned with business questions and utilizing datasets in an optimal way
- Develop deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for
- Build data products based on automated data models, aligned with use case requirements, and advise data scientists, analysts, and visualization developers on how to use these data models
- Develop analytical data products for reusability, governance, and compliance by design
- Align with organization strategy and implement a semantic layer for analytics data products
- Support data stewards and other engineers in maintaining data catalogs, data quality measures, and governance frameworks

Education: B.Tech/B.S., M.Tech/M.S., or PhD in Engineering, Computer Science, Pharmaceuticals, Healthcare, Data Science, Business, or a related field

Required experience:
- 5+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience in analyzing, modeling, and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets)
- High proficiency in SQL, Python, and AWS
- Good understanding and comprehension of the requirements provided by the Data Product Owner and Lead Analytics Engineer
- Experience creating/adopting data models to meet requirements from Marketing, Data Science, and Visualization stakeholders
- Experience with feature engineering
- Experience with cloud-based (AWS/GCP/Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.)
- Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot, and low-code tools (e.g., Dataiku)
- Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders
- Experience in analytics use cases of pharmaceutical products and vaccines
- Experience in market analytics and related use cases

Preferred experience:
- Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines
- Experience with Agile ways of working, leading or working as part of scrum teams
- Certifications in AWS and/or modern data technologies
- Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors
- Experience in building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers, and business stakeholders
- Experience with data visualization technologies (e.g., Power BI)
Posted 2 days ago
6.0 - 12.0 years
9 - 25 Lacs
Hyderabad, Telangana, India
On-site
Description We are looking for an experienced SAP BODS developer to join our team in India. The ideal candidate will have a strong background in ETL processes and data integration, with a proven track record of successfully managing complex data transformation projects. Responsibilities Designing and developing ETL processes using SAP BODS (Business Objects Data Services). Data profiling, cleansing, and transformation to ensure data quality. Collaborating with business analysts and stakeholders to gather requirements and translate them into technical specifications. Optimizing and tuning ETL processes for performance improvements. Creating and maintaining documentation for ETL processes and data flows. Troubleshooting and resolving issues related to data integration and ETL processes. Participating in code reviews and ensuring adherence to best practices in development. Skills and Qualifications 6-12 years of experience in SAP BODS or related ETL tools. Strong knowledge of data warehousing concepts and methodologies. Proficiency in SQL and database management systems (e.g., Oracle, SQL Server, MySQL). Experience with data migration and integration projects. Familiarity with data modeling and data architecture. Ability to work with various data sources including flat files, databases, and web services. Strong analytical and problem-solving skills. Excellent communication and collaboration skills.
Posted 2 days ago
3.0 - 5.0 years
12 - 14 Lacs
Mumbai
Work from Office
Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.
Posted 2 days ago
5.0 - 10.0 years
11 - 16 Lacs
Pune
Work from Office
Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.
Posted 2 days ago
4.0 - 5.0 years
12 - 14 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
ETL Process Design: Designing and developing ETL processes using Talend for data integration and transformation. Data Extraction: Extracting data from various sources, including databases, APIs, and flat files. Data Transformation: Transforming data to meet business requirements and ensuring data quality. Data Loading: Loading transformed data into target systems, such as data warehouses or data lakes. Job Scheduling: Scheduling and automating ETL jobs using Talend's scheduling tools. Performance Optimization: Optimizing ETL workflows for efficiency and performance. Error Handling: Implementing robust error handling and logging mechanisms in ETL processes. Data Profiling: Performing data profiling to identify data quality issues and inconsistencies. Documentation: Documenting ETL processes, data flow diagrams, and technical specifications. Collaboration with Data Teams: Working closely with data analysts, data scientists, and other stakeholders to understand data requirements. Minimum 4 to maximum 7 years of relevant experience. Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
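Talend jobs are built in Talend Studio rather than hand-written code, so the sketch below is only a language-agnostic illustration, in Python, of the error-handling and logging pattern this role describes: reject rows are captured and logged rather than failing the whole load.

```python
# Illustrative only: the reject-and-log pattern common in ETL error handling.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def load_rows(rows):
    rejects = []
    for row in rows:
        try:
            amount = float(row["amount"])  # a transform step that can fail
            log.info("loaded id=%s amount=%.2f", row["id"], amount)
        except (KeyError, ValueError) as exc:
            rejects.append((row, str(exc)))  # quarantine the bad row
            log.warning("rejected row %r: %s", row, exc)
    return rejects

rejects = load_rows([{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "oops"}])
```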
Posted 2 days ago
6.0 - 8.0 years
3 - 15 Lacs
Hyderabad, Telangana, India
On-site
Responsibilities:
- Trusted Advisor: Provide direction to clients in different verticals on their selection of data and ML platforms, data strategies, expected impact, and relevant tools and methodologies that they can deploy to accelerate Big Data and advanced analytics programs.
- Pre-Sales and Consulting: Lead and drive technology consulting engagements and pre-sales, architecture assessments, and discovery workshops. Plan data engineering programs of Fortune-1000 enterprises and lead a team of principal consultants and data engineers who work on client accounts and drive consulting engagements to success.
- Technology Strategy and R&D: Create and bring to market solutions in data engineering, MLOps, and data governance. Responsible for driving innovation through research and development activities on industry trends, defining go-to-market strategies, and developing assets and solutions strategy across multiple industries.
- Engineering: Work with Grid Dynamics' delivery organization to ensure that the right tools, technologies, and processes are in place across the firm to transform and continuously improve the quality and efficiency of our clients' data platforms and data management processes.
- Business Development & Partnership: Manage relationships with key technology partners (AWS, GCP, Azure) and industry analysts.

Requirements:
- Extensive practical experience in Big Data engineering, data governance, and cloud data platforms.
- Strong understanding of cloud-based architectures for data collection, data aggregation, reporting, and BI.
- Strong understanding of ML platforms and tooling, including open-source, cloud-native, and proprietary.
- Deep domain knowledge in at least one of the following industries: Retail, Finance, Manufacturing, Healthcare, Life Sciences.
- Experience in managing and delivering sophisticated analytical and data engineering programs at enterprise scale.
- Managed key client relations worldwide and advised global technology and business leaders on innovation and data strategy.
- Experience with Big 4 consulting is a plus.
Posted 2 days ago
3.0 - 5.0 years
3 - 18 Lacs
Bengaluru, Karnataka, India
On-site
Your key responsibilities
- Design & Develop Agentic AI Applications: Utilize frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution.
- Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems.
- Fine-Tune Language Models: Customize LLMs and SLMs using domain-specific data to improve performance and relevance in specialized applications.
- NER Models: Train OCR- and NLP-based models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP).
- Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning.
- Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements.
- Optimize AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring.

Your skills and experience
- 15+ years of professional experience in AI/ML development, with a focus on agentic AI systems.
- Proficient in Python, Python API frameworks, and SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch.
- Experience in deploying AI models on cloud platforms (e.g., GCP, AWS).
- Experience with LLMs (e.g., GPT-4), SLMs, and prompt engineering.
- Understanding of semantic technologies, ontologies, and RDF/SPARQL.
- Familiarity with MLOps tools and practices for continuous integration and deployment.
- Skilled in building and querying knowledge graphs using tools like Neo4j.
- Hands-on experience with vector databases and embedding techniques.
- Familiarity with RAG architectures and hybrid search methodologies.
- Experience in developing AI solutions for specific industries such as healthcare, finance, or ecommerce.
- Strong problem-solving abilities and analytical thinking.
- Excellent communication skills for cross-functional collaboration.
- Ability to work independently and manage multiple projects simultaneously.
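A minimal retrieval sketch for the RAG pipelines described, assuming the faiss-cpu and sentence-transformers packages (the corpus, model name, and query are illustrative; a full pipeline would pass the retrieved context to an LLM):

```python
# Minimal vector-retrieval step of a RAG pipeline with FAISS.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

docs = ["Claims are settled within 30 days.",
        "Premiums are due on the first of each month."]

# Embed the corpus; normalized vectors make inner product == cosine similarity
model = SentenceTransformer("all-MiniLM-L6-v2")
emb = model.encode(docs, normalize_embeddings=True)

index = faiss.IndexFlatIP(emb.shape[1])
index.add(np.asarray(emb, dtype="float32"))

# Retrieve the closest document for a user query
q = model.encode(["When are claims paid?"], normalize_embeddings=True)
scores, ids = index.search(np.asarray(q, dtype="float32"), k=1)
print(docs[ids[0][0]], float(scores[0][0]))
```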
Posted 3 days ago
2.0 - 7.0 years
2 - 5 Lacs
Delhi, India
On-site
Experience: 4-10 years of experience in digital operations, digital marketing, or campaign management. Technical Proficiency: Experience working with digital platforms such as Google Ads, Facebook Ads, and programmatic advertising platforms. Familiarity with digital analytics tools (e.g., Google Analytics, Adobe Analytics) and data visualization tools (e.g., Tableau, Power BI). Basic understanding of HTML, JavaScript, and ad serving technologies. Analytical Skills: Strong ability to analyze campaign performance, identify trends, and make data-driven recommendations to improve effectiveness. Process-Oriented: Ability to manage and optimize digital processes to ensure efficiency and consistency. Communication: Excellent written and verbal communication skills, with the ability to explain technical details to non-technical stakeholders. Attention to Detail: High attention to detail, ensuring that all digital campaigns meet quality and performance standards.
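As a small, hypothetical example of the campaign-performance analysis this role involves (the exported columns and figures are invented), computing CTR and a week-over-week spend trend with pandas:

```python
# Illustrative campaign analysis: CTR and week-over-week spend change.
import pandas as pd

df = pd.DataFrame({
    "week": ["2024-W01", "2024-W01", "2024-W02"],
    "impressions": [120000, 80000, 150000],
    "clicks": [2400, 1200, 3300],
    "spend": [500.0, 300.0, 700.0],
})

weekly = df.groupby("week")[["impressions", "clicks", "spend"]].sum()
weekly["ctr"] = weekly["clicks"] / weekly["impressions"]      # click-through rate
weekly["spend_wow"] = weekly["spend"].pct_change()            # week-over-week trend
print(weekly)
```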
Posted 3 days ago
3.0 - 10.0 years
5 - 24 Lacs
Delhi, India
On-site
Description We are seeking a skilled Data Administrator to join our team in India. The ideal candidate will be responsible for managing and maintaining our data infrastructure, ensuring data integrity, and optimizing data-related processes. Responsibilities Manage and maintain data integrity and accuracy across various databases and systems. Develop and implement data management strategies and procedures. Monitor database performance and conduct regular audits to ensure data compliance. Collaborate with IT and other departments to optimize data-related processes and workflows. Prepare and present data reports and analytics to stakeholders. Skills and Qualifications Bachelor's degree in Computer Science, Information Technology, or related field. 3-10 years of experience in data administration or a related field. Proficiency in database management systems such as SQL, Oracle, or MySQL. Strong analytical skills and attention to detail. Experience with data visualization tools like Tableau or Power BI is a plus. Familiarity with data governance and compliance frameworks.
Posted 3 days ago
3.0 - 5.0 years
3 - 18 Lacs
Bengaluru, Karnataka, India
On-site
Role Description: It's a popular perception that if you have experience in Trade Finance Operations, you are never out of a job. We handle multiple products like Letters of Credit, Collections, Bank Guarantees, etc. Depending on your appetite to learn, you will get enough opportunities to learn multiple products/processes. The learning never ends in Trade Finance Operations. Our subject matter experts will ensure that you get the necessary training on the products and processes.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your key responsibilities:
- Handle the day-to-day processing of Collections, Letters of Credit, and Bank Guarantees as part of the trade operations team in the Delivery Hub, to meet agreed customer service level agreements, and review outstanding transactions.
- Manage and ensure compliance (KOP, Ops manual, etc.) with internal policies and audit and regulatory requirements.
- Support and achieve excellent partnership with branch operations and respective sales staff.

Your skills and experience:
- Adequate understanding of trade-related rules and guidelines as issued by the ICC (UCP, URC, etc.)
- Good understanding of legal, credit, and operational risks in the handling of trade products/services
- Good communication skills (oral and written)
- Flexibility to work late-night shifts
Posted 3 days ago
0.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Req ID: 328407

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Tableau Admin with AWS Experience to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Job Description: We are seeking a skilled Tableau Administrator with experience in AWS to join our team. The ideal candidate will be responsible for managing and optimizing our Tableau Server environment hosted on AWS, ensuring efficient operation, data security, and seamless integration with other data sources and analytics tools.

Key Responsibilities:
- Manage, configure, and administer Tableau Server on AWS, including setting up sites and managing user access and permissions.
- Monitor server activity/performance, conduct regular system maintenance, and troubleshoot issues to ensure optimal performance and minimal downtime.
- Collaborate with data engineers and analysts to optimize data sources and dashboard performance.
- Implement and manage security protocols, ensuring compliance with data governance and privacy policies.
- Automate monitoring and server management tasks using AWS and Tableau APIs.
- Assist in the design and development of complex Tableau dashboards.
- Provide technical support and training to Tableau users.
- Stay updated on the latest Tableau and AWS features and best practices, recommending and implementing improvements.

Qualifications:
- Proven experience as a Tableau Administrator, with strong skills in Tableau Server and Tableau Desktop.
- Experience with AWS, particularly with services relevant to hosting and managing Tableau Server (e.g., EC2, S3, RDS).
- Familiarity with SQL and experience working with various databases.
- Knowledge of data integration, ETL processes, and data warehousing principles.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
- Relevant certifications in Tableau and AWS are a plus.

A Tableau Administrator, also known as a Tableau Server Administrator, is responsible for managing and maintaining Tableau Server, a platform that enables organizations to create, share, and collaborate on data visualizations and dashboards. Here's a typical job description for a Tableau Admin:
1. Server Administration: Install, configure, and maintain Tableau Server to ensure its reliability, performance, and security.
2. User Management: Manage user accounts, roles, and permissions on Tableau Server, ensuring appropriate access control.
3. Security: Implement security measures, including authentication, encryption, and access controls, to protect sensitive data and dashboards.
4. Data Source Connections: Set up and manage connections to various data sources, databases, and data warehouses for data extraction.
5. License Management: Monitor Tableau licensing, allocate licenses as needed, and ensure compliance with licensing agreements.
6. Backup and Recovery: Establish backup and disaster recovery plans to safeguard Tableau Server data and configurations.
7. Performance Optimization: Monitor server performance, identify bottlenecks, and optimize configurations to ensure smooth dashboard loading and efficient data processing.
8. Scaling: Scale Tableau Server resources to accommodate increasing user demand and data volume.
9. Troubleshooting: Diagnose and resolve issues related to Tableau Server, data sources, and dashboards.
10. Version Upgrades: Plan and execute server upgrades, apply patches, and stay current with Tableau releases.
11. Monitoring and Logging: Set up monitoring tools and logs to track server health, user activity, and performance metrics.
12. Training and Support: Provide training and support to Tableau users, helping them with dashboard development and troubleshooting.
13. Collaboration: Collaborate with data analysts, data scientists, and business users to understand their requirements and assist with dashboard development.
14. Documentation: Maintain documentation for server configurations, procedures, and best practices.
15. Governance: Implement data governance policies and practices to maintain data quality and consistency across Tableau dashboards.
16. Integration: Collaborate with IT teams to integrate Tableau with other data management systems and tools.
17. Usage Analytics: Generate reports and insights on Tableau usage and adoption to inform decision-making.
18. Stay Current: Keep up-to-date with Tableau updates, new features, and best practices in server administration.

A Tableau Administrator plays a vital role in ensuring that Tableau is effectively utilized within an organization, allowing users to harness the power of data visualization and analytics for informed decision-making.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
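The automation responsibility above (server management via Tableau APIs) is commonly handled with the tableauserverclient package; here is a hedged sketch, where the server URL, token name/value, and site are placeholders:

```python
# Hedged sketch: sign in to Tableau Server with a personal access token
# and list users with their site roles, via tableauserverclient (TSC).
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("admin-token", "<token-value>",
                                   site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    users, _pagination = server.users.get()  # first page of users
    for user in users:
        print(user.name, user.site_role)
```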
Posted 3 days ago
6.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring SnapLogic professionals in the following areas:

Experience: 6 to 8 years.
- Configuring and deploying SnapLogic pipelines to integrate data from various sources.
- Troubleshooting and resolving issues related to ETL processes.
- Developing and maintaining user documentation for ETL processes.
- Resource to work from the CAT office 3 days a week.
- Overall 6-8 years of experience, including proven 3+ years of experience in building and designing solutions for data warehouses and working with large data sets.
- 3-4 years of development experience in building SnapLogic pipelines, error handling, scheduling tasks, and alerts.
- Analyze and translate functional specifications/user stories into technical specifications.
- Performs a Sr. Developer role in end-to-end implementations in SnapLogic.
- Strong database knowledge, i.e., RDBMS Oracle/PLSQL, Snowflake.
- Proven experience with cloud data storage and access using Snowflake/S3.
- Experienced in business interfacing; possesses a strong data background and a good understanding of requirements analysis and design.
- Data movement and ETL experience.
- Experience with AWS/Azure cloud environment development and deployment.
- Knowledge of APIs and any scripting language is a plus.

Note: The resource should be able to provide technical guidance and mentorship to development teams along with team leads, review and optimize existing pipelines for performance and efficiency, and collaborate with stakeholders to understand business requirements and turn them into technical solutions.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
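For the scheduling-and-alerts side, SnapLogic triggered tasks are invoked over HTTPS; the sketch below is a hedged illustration using the requests library, where the task URL and bearer token are hypothetical (the real URL comes from the task's details in SnapLogic Manager):

```python
# Hedged sketch: invoke a SnapLogic triggered task over HTTPS.
import requests

# Hypothetical triggered-task URL copied from SnapLogic Manager
TASK_URL = ("https://elastic.snaplogic.com/api/1/rest/slsched/feed/"
            "ExampleOrg/projects/etl/DailyLoadTask")

resp = requests.post(
    TASK_URL,
    headers={"Authorization": "Bearer <task-token>"},  # placeholder token
    json={"run_date": "2024-01-31"},                   # pipeline parameters, if any
    timeout=60,
)
resp.raise_for_status()  # surface HTTP errors so a scheduler can alert on them
print(resp.status_code)
```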
Posted 3 days ago