3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
The PL/SQL Developer plays a critical role in supporting the development, enhancement, and maintenance of database applications within our organization. This position is vital for ensuring the integrity and performance of database systems through effective programming and optimization of PL/SQL code. The ideal candidate will leverage their expertise in relational database technologies to design and implement complex SQL queries, stored procedures, and triggers, enabling efficient data retrieval and manipulation. By collaborating closely with software engineers and data analysts, the PL/SQL Developer facilitates seamless integration of database functionalities with application workflows. Contributing to the overall data strategy, this role is not only centered on coding but also involves identifying and resolving performance issues. In a dynamically changing technological environment, the PL/SQL Developer must stay current with industry best practices, continuously improving their skills to deliver robust database solutions.

Key Responsibilities:
- Design and develop PL/SQL scripts for data manipulation and retrieval.
- Write efficient and optimized SQL queries to enhance performance.
- Develop stored procedures, functions, and triggers to automate processes.
- Conduct thorough debugging and troubleshooting of PL/SQL code.
- Implement database performance tuning strategies.
- Collaborate with application developers to integrate database solutions.
- Maintain documentation of database structures, code changes, and updates.
- Conduct code reviews and provide constructive feedback to peers.
- Support data migration and data cleansing activities.
- Work closely with business stakeholders to understand data requirements.
- Monitor database performance and implement improvements as needed.
- Enhance existing PL/SQL applications for improved efficiency.
- Stay updated with new database technologies and best practices.
- Participate in disaster recovery and data backup procedures.
- Ensure compliance with data governance policies and practices.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience in PL/SQL development.
- Strong knowledge of Oracle databases and PL/SQL programming.
- Proficient in SQL and database design principles.
- Experience with performance tuning and optimization techniques.
- Familiarity with database management tools and software.
- Ability to write complex queries and stored procedures.
- Knowledge of data modeling concepts and best practices.
- Experience in working within an Agile development environment.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Experience with version control systems (e.g., Git, SVN).
- Proven ability to deliver projects within deadlines.
- Knowledge of additional programming languages (e.g., Java, Python) is a plus.
- Experience with ETL processes or data warehousing solutions is a benefit.
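To give a flavor of the application-integration work this role mentions, here is a minimal sketch of calling an Oracle stored procedure from Python via the oracledb driver. The connection details and the procedure name (update_customer_status) are hypothetical placeholders, not details from the posting.

```python
# A minimal sketch, assuming a hypothetical PL/SQL procedure
# update_customer_status(p_customer_id IN NUMBER, p_status IN VARCHAR2).
import oracledb

# Connection parameters are illustrative placeholders.
conn = oracledb.connect(user="app_user", password="app_pass",
                        dsn="db-host:1521/ORCLPDB1")
try:
    with conn.cursor() as cur:
        # callproc invokes the PL/SQL procedure with positional arguments.
        cur.callproc("update_customer_status", [1042, "ACTIVE"])
    conn.commit()  # persist the change made inside the procedure
finally:
    conn.close()
```

Keeping the business logic inside the procedure and calling it from the application layer is one common way to integrate database solutions with application workflows, as the responsibilities above describe.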
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
About the Role: As a Senior ETL Developer at KPMG in India, you will be responsible for leveraging your expertise in Informatica Intelligent Cloud Services (IICS), ETL processes, Data Warehousing (DWH), and SQL to design, develop, and manage ETL workflows. Your role will involve implementing and optimizing data integration solutions, writing and optimizing SQL queries, ensuring data quality and integrity, and collaborating with data architects and business teams to meet data requirements. Additionally, you will be troubleshooting and optimizing ETL processes for scalability and performance improvements while maintaining data governance, compliance, and security standards.

Key Responsibilities:
- Design, develop, and manage ETL workflows using Informatica IICS.
- Implement and optimize data integration solutions for enterprise data warehousing (DWH).
- Write and optimize SQL queries for data extraction, transformation, and reporting.
- Ensure data quality, consistency, and integrity across systems.
- Collaborate with data architects and business teams to understand data requirements.
- Troubleshoot and optimize ETL processes for scalability and performance improvements.
- Maintain data governance, compliance, and security standards.

Required Skills & Qualifications:
- 4-6 years of experience in ETL development and data integration.
- Strong hands-on experience with Informatica IICS for cloud-based ETL solutions.
- Proficiency in SQL for querying, data transformation, and database optimization.
- Solid understanding of data warehousing (DWH) principles and best practices.
- Experience in performance tuning, troubleshooting, and optimization of ETL workflows.

Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure, or GCP) for data integration.
- Knowledge of big data technologies like Spark, Hadoop, or Databricks.
- Exposure to BI tools (Power BI, Tableau) for data visualization.

Qualifications:
- B.TECH/M.TECH/MCA/M.SC
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
jaipur, rajasthan
On-site
As a SQL Developer, you will be responsible for designing, developing, and maintaining SQL databases to support business operations and analytics. You will work with various data sources, ETL processes, and data visualization tools to deliver actionable insights.

Key Responsibilities:
- Analyze data to identify trends and insights.
- Develop and optimize data models and SQL queries.
- Manage and integrate data from multiple sources using data warehousing solutions.
- Utilize ODI for ETL processes to ensure data integrity.
- Create accurate and visually appealing reports with actionable insights.
- Provide production support for data systems to ensure reliability.
- Troubleshoot data issues and system errors.
- Collaborate with IT, business stakeholders, and analysts.
- Document data processes, reports, and dashboards.
- Be available for 24/7 shifts.

Required Skills and Experience:
- Strong proficiency in SQL, including T-SQL and PL/SQL.
- Experience with database design and normalization.
- Knowledge of ETL processes and tools (e.g., ODI).
- Experience with data warehousing concepts and data modeling.
- Understanding of data visualization tools (e.g., Power BI, Tableau).
- Good problem-solving and analytical skills.
- Strong attention to detail.
- Ability to work independently and as part of a team.

Qualifications:
- Bachelor's in Computer Science, IT, Data Science, or equivalent
Posted 1 week ago
2.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
You have an exciting opportunity as an Associate Technical Architect - AWS with 8-10 years of experience. In this role, you will work hands-on with AWS Glue or Databricks, PySpark, and Python, with a minimum of 2 years of expertise in PySpark and the AWS Cloud. You will also be expected to have hands-on experience with Step Functions, Lambda, S3, Secrets Manager, Snowflake/Redshift, RDS, and CloudWatch, along with proficiency in crafting low-level designs for data warehousing solutions on the AWS cloud. A proven track record in implementing big-data solutions within the AWS ecosystem, including Data Lakes, will be valuable. Familiarity with data warehousing, data quality assurance, and monitoring practices, as well as the ability to construct scalable data pipelines and ETL processes, is essential for this role. Experience with DevOps environments, data security services, data modeling, integration, and design principles is also desired. Strong communication and analytical skills are a must, along with being a dedicated team player with a goal-oriented mindset. Your commitment to delivering quality work with attention to detail will be crucial in this position. This is a full-time, permanent position with a day shift schedule; the work location is Pune (in person/hybrid). If you meet the qualifications and are ready to take on this challenging role, we look forward to hearing from you.
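As an illustration of the kind of pipeline work described above, here is a minimal PySpark sketch of a cleanse-and-write transformation. The S3 paths and column names are hypothetical; a real AWS Glue job would typically wrap similar logic in Glue's job context.

```python
# A minimal PySpark sketch; paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw data from a (hypothetical) S3 landing zone.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Basic cleansing: drop rows missing a key, normalize a status column,
# and filter out non-positive amounts.
cleaned = (
    orders
    .dropna(subset=["order_id"])
    .withColumn("status", F.upper(F.col("status")))
    .filter(F.col("amount") > 0)
)

# Write partitioned output back to a curated zone.
cleaned.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://example-bucket/curated/orders/")
```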
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Solution Architect specializing in Data & AI with over 8 years of experience, you will be a key player in leading and driving our data-driven transformation at Inferenz. This role requires designing and implementing cutting-edge AI and data solutions that align with business objectives. Working closely with cross-functional teams, you will create scalable, high-performance architectures utilizing modern technologies in data engineering, machine learning, and cloud computing.

Your responsibilities will include architecting and designing end-to-end data and AI solutions to address business challenges and optimize decision-making. Defining and implementing best practices for data architecture, data governance, and AI model deployment will be crucial. Collaboration with data engineers, data scientists, and business stakeholders is essential to deliver scalable and high-impact AI-driven applications. Leading the integration of AI models with enterprise applications, ensuring seamless deployment and operational efficiency, is a key aspect of your role. Your expertise will be instrumental in evaluating and recommending the latest technologies in data platforms, AI frameworks, and cloud-based analytics solutions. Ensuring data security, compliance, and ethical AI implementation will be a top priority, as will guiding teams in adopting advanced analytics, AI, and machine learning models for predictive insights and automation, and driving innovation by identifying new opportunities for AI and data-driven improvements within the organization.

To excel in this role, you must have:
- 8+ years of experience in designing and implementing data and AI solutions.
- Strong expertise in cloud platforms such as AWS, Azure, or Google Cloud.
- Hands-on experience with big data technologies like Spark, Databricks, and Snowflake.
- Proficiency in ML frameworks such as TensorFlow, PyTorch, and Scikit-learn.
- A deep understanding of data modeling, ETL processes, and data governance frameworks.
- Experience in MLOps, model deployment, and automation.
- Proficiency in Generative AI frameworks.
- Strong programming skills in Python, SQL, and Java/Scala (preferred).
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong communication and leadership skills, with the ability to drive technical conversations.

Preferred qualifications include certifications in cloud architecture, data engineering, or AI/ML; experience with generative AI and developing AI-driven analytics solutions for enterprises; familiarity with Graph RAG, building AI agents, and multi-agent systems; additional certifications in AI/GenAI; and proven leadership skills.

If you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. Uplers is waiting for you!
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a DevOps Engineer specializing in data, you will be dedicated to implementing and managing our cloud-based data infrastructure utilizing AWS and Snowflake. Your primary responsibility will involve collaborating with data engineers, data scientists, and various stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data technology stacks, MLOps methodologies, automation, and information security will play a crucial role in improving our data pipelines and ensuring data integrity and availability.

You should possess a Bachelor's degree in Computer Science or Engineering, or at least 3 years of experience in a DevOps engineering role or a similar engineering position. A strong command of AWS services (e.g., EC2, S3, Lambda, RDS) and cloud infrastructure best practices is essential. Proficiency in Snowflake, including data modeling, performance tuning, and query optimization, is required, as is experience with modern data technologies and tools (e.g., Apache Airflow, dbt, ETL processes). Familiarity with MLOps frameworks and methodologies such as MLflow, Kubeflow, or SageMaker, as well as knowledge of containerization and orchestration tools like Docker and Kubernetes, will be beneficial. Proficiency in scripting languages such as Python, Ruby, PHP, and Perl, along with automation frameworks, is necessary. Additionally, a strong understanding of Git and GitHub workflows, databases, SQL, CI/CD tools and practices (e.g., Jenkins, GitLab CI), and information security principles is crucial. Excellent problem-solving skills, a collaborative team spirit, and strong communication skills, both verbal and written, are highly valued.

Preferred qualifications include experience with data governance and compliance frameworks, familiarity with data visualization tools (e.g., Tableau, Looker), and knowledge of machine learning frameworks and concepts. Possessing relevant security certifications (e.g., CISSP, CISM, AWS Certified Security) is considered a plus.

Your key responsibilities will include infrastructure management, data pipeline deployment, Snowflake administration, MLOps implementation, information security integration, CI/CD implementation, support and troubleshooting, tool development, automation and visualization, system maintenance, monitoring and performance tuning, collaboration with stakeholders, and documentation of data architecture and security protocols.

unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in multiple countries. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide, leveraging advanced AI technologies to enhance customer experiences and drive operational efficiency. We are committed to innovation and diversity, welcoming individuals from all backgrounds to join us in supporting our international clientele.
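For context on the pipeline-deployment work this posting mentions, below is a minimal Apache Airflow DAG sketch with a single Python task. The DAG id, schedule, and task body are illustrative assumptions, not details from the posting.

```python
# A minimal Airflow DAG sketch; ids, schedule, and task body are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def refresh_snowflake_stage():
    # Placeholder for real work, e.g., loading files into a Snowflake stage.
    print("refreshing stage...")


with DAG(
    dag_id="daily_data_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # run once per day
    catchup=False,       # skip backfilling past runs
) as dag:
    refresh = PythonOperator(
        task_id="refresh_snowflake_stage",
        python_callable=refresh_snowflake_stage,
    )
```

In practice, a DAG like this would be version-controlled in Git and promoted through a CI/CD pipeline, which is the combination of skills the posting calls for.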
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
The ideal candidate for this role should have a strong background in SQL, CDP (Treasure Data), Python/Digdag, and Presto SQL for data engineering. It is essential to possess knowledge and hands-on experience in cloud technologies such as Microsoft Azure and AWS, ETL processes, and API integration tools. Proficiency in Python and SQL is a must, along with exposure to Big Data technologies like Presto, Hadoop, Cassandra, and MongoDB. Previous experience with CDP implementation using tools like Treasure Data or similar platforms such as ActionIQ would be a significant advantage. Familiarity with data modelling and architecture is preferred, as well as excellent SQL and advanced SQL skills. Knowledge or experience in data visualization tools like Power BI and an understanding of AI/ML concepts would be beneficial.

The candidate should hold a BE/BTech degree and be actively involved in requirements gathering, demonstrating the ability to create technical documentation and possessing strong analytical and problem-solving skills. The role entails working on the end-to-end implementation of CDP projects, participating in CDP BAU activities and go-live cut-overs, and providing day-to-day CDP application support. Automation of existing tasks and flexibility with working hours are expected, along with the ability to thrive in a process-oriented environment.

Please note that the above job description is a standard summary based on the provided information.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Are you ready to identify new opportunities and bring forth new process improvements to help automate the functions for new technology platforms? Our team is a highly collaborative, inclusive environment where we value relationship-building, strategic thinking, and solution-oriented members.

Job Summary: As a Data Domain Architect Associate within the Analytics Solutions and Delivery (AS&D) team in Workforce Planning, you will be responsible for delivering data automation and visualization that supports multiple Workforce Planning functions. Projects engaged by the Analytics team can be complex, data-intensive, and of a high level of difficulty, having a significant impact on the business. Your role will require quick assessment and comprehension of unstructured problems to develop practical problem-solving strategies.

Job Responsibilities:
- Contribute to data mining architectures, modeling standards, reporting, and data analysis methodologies, and perform in-depth analysis to resolve business problems by employing the appropriate tools available in the enterprise.
- Identify new opportunities for process improvements and bring a critical skill set to automate functions on new technology platforms.
- Communicate data findings through engaging stories that are easy to understand.
- Quickly assess the big picture in complex situations, identify what is critical, and implement rapid prototyping and deployment of existing capability to the target-state platform.
- Work with business domain experts to identify data relevant for analysis.
- Develop new analytical methods and/or tools as required.
- Respond to and resolve data mining performance issues; monitor data mining system performance and implement efficiency improvements.
- Build test scenarios and assist in UAT.
- Develop and maintain comprehensive documentation for data processes and workflows.

Required Qualifications, Capabilities, and Skills:
- Hands-on experience with data pulls using SQL, SQL tuning, data validation, and dimensional and relational data modeling.
- Understanding of, and hands-on experience with, designing and implementing ETL processes.
- Experience using Alteryx, delivering reports and insights, data analytics inference, and visualizations to communicate with partners and stakeholders.
- Experience on data migration projects (Snowflake/Databricks).
- Demonstrated proficiency with Microsoft Office (Excel, PowerPoint).
- Strong problem-solving skills and attention to detail.

Preferred Qualifications, Capabilities, and Skills:
- Bachelor's degree with 6+ years, or Master's degree with 4+ years, of experience operating as a reporting and analytics professional, with a background in Computer Science, MCA, Statistics, Engineering, Mathematics, Information Science, or related disciplines.
- Experience with Python, PySpark, and microservices (AWS) is good to have.
- Exposure to cloud (AWS/Azure) architecture and Data Lake is a plus.
Posted 1 week ago
0.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Are you interested in being part of a team that harnesses the power of data to make informed decisions, drive innovation, and achieve competitive advantage in today's data-driven business landscape? Our team is a highly collaborative, inclusive environment where we value relationship-building, strategic thinking, and solution-oriented members.

As a Data Domain Architect Associate Senior on the Analytics Solutions and Delivery (AS&D) Team within Workforce Planning, you will be tasked with making quick assessments and comprehension of unstructured problems to develop practical problem-solving strategies. You should have a strong background in data analysis, particularly with Alteryx workflows, and relational database skills to interpret and reverse engineer legacy processes. Excellent communication skills and the ability to work directly with end users are essential. Development experience is preferred, and you should be capable of working independently in a fast-paced environment.

Job Responsibilities:
- Analyze and interpret complex data sets using Alteryx/Databricks.
- Reverse engineer existing processes and workflows.
- Collaborate with end users to gather requirements and understand business needs.
- Develop and maintain comprehensive documentation for data processes and workflows.
- Work closely with cross-functional teams for successful project implementation.
- Provide insights and recommendations based on data analysis.
- Build test scenarios and assist in UAT.
- Work closely with the engineering team for execution.
- Work autonomously to achieve objectives and meet deadlines.
- Identify new opportunities for process improvements.

Required Qualifications, Capabilities, and Skills:
- Technical knowledge of data management, governance, data architecture, and big data platforms like AWS, Databricks, and Snowflake.
- Understanding of data modeling concepts, including operational data vs. analytical data.
- Experience with ETL processes and other data-related workstreams.
- Bachelor's degree in Computer Science, Information Technology, Business Administration, or a related field.
- Proven experience as a Data/Business Analyst or similar role, with a focus on technical data analysis.
- Proficiency in Alteryx for data preparation, blending, and analysis.
- Strong understanding of relational databases and SQL.
- Experience in reverse engineering processes and workflows.
- Excellent communication skills for conveying technical information to non-technical stakeholders.
- Comfortable working directly with end users.
- Development experience in Python for data wrangling and automation.
- Strong problem-solving skills and attention to detail.

Preferred Qualifications, Capabilities, and Skills:
- Experience in large-scale data handling.
- Working experience with Agile methodologies and an understanding of Scrum and Kanban boards.
- Exposure to visualization tools (e.g., Tableau, Qlik, Power BI, Looker).
- Cloud certification is a plus.
- Exposure to scheduling tools like Airflow or Control-M is a plus.
Posted 1 week ago
4.0 - 9.0 years
10 - 20 Lacs
bengaluru
Work from Office
Job Title: Data Science Engineer
Experience Level: Mid-Senior (6+ years)

About the Role: We are seeking a highly skilled and motivated Data Science Engineer with a strong background in building scalable data solutions and visualizations. The ideal candidate will have at least 6 years of experience working with Power BI, Azure Data Factory (ADF), Databricks, and Snowflake, along with hands-on experience in ETL processes, preferably involving SAP data sources.

Key Responsibilities:
- Design, develop, and maintain Power BI dashboards and reports to support business decision-making.
- Build and optimize data pipelines using ADF, Databricks, and Snowflake.
- Collaborate with cross-functional teams to understand data requirements and deliver scalable solutions.
- Perform data extraction, transformation, and loading (ETL) from various sources, including SAP.
- Ensure data quality, integrity, and governance across all platforms.
- Implement best practices for data engineering and analytics workflows.
- Monitor and troubleshoot data pipelines and reporting issues.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- Minimum of 6 years of experience in data engineering and analytics.
- Proficiency in Power BI (DAX, data modeling, report design), Azure Data Factory, Databricks (PySpark or Scala), and Snowflake (SQL, performance tuning).
- Strong understanding of ETL processes and data integration techniques.
- Experience working with SAP data sources is a plus.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- Experience with CI/CD pipelines for data workflows.
- Familiarity with data governance and security best practices.
- Knowledge of machine learning or advanced analytics is a bonus.

Flexible work arrangements.
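As a hedged illustration of the Snowflake performance-tuning work listed above, here is a small Python sketch that pulls the slowest recent queries from Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view using the official connector. The credentials, warehouse name, and thresholds are placeholder assumptions.

```python
# A minimal sketch using the Snowflake Python connector; credentials,
# warehouse, and thresholds are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ANALYST",        # placeholder credentials
    password="***",
    account="xy12345",
    warehouse="ANALYTICS_WH",
)
try:
    cur = conn.cursor()
    # Find the ten slowest queries from the last day as tuning candidates.
    cur.execute("""
        SELECT query_id, total_elapsed_time / 1000 AS seconds, query_text
        FROM snowflake.account_usage.query_history
        WHERE start_time > DATEADD(day, -1, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
    """)
    for query_id, seconds, text in cur:
        print(f"{query_id}: {seconds:.1f}s  {text[:80]}")
finally:
    conn.close()
```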
Posted 1 week ago
6.0 - 8.0 years
4 - 7 Lacs
delhi, india
On-site
We are seeking a Tableau to Power BI Migration Specialist with 6 to 8 years of experience in business intelligence, specifically within the Insurance domain. The ideal candidate will have a strong background in migrating BI solutions from Tableau to Power BI, optimizing data models, and ensuring efficient reporting. You will be responsible for understanding complex business requirements, translating them into Power BI solutions, and ensuring the successful implementation of best practices in reporting and performance optimization.

Required Skills and Experience:
- 6-8 years of experience working in business intelligence, with substantial experience in both Tableau and Power BI.
- Strong expertise in Power BI report development, including DAX, data modeling, and visualization best practices.
- Hands-on experience with SQL query performance tuning to optimize the data extraction process for Power BI.
- Proven ability to migrate Tableau dashboards to Power BI, with experience in data transformation and visual design adaptation.
- Expertise in Power BI Service, including report publishing, sharing, and managing data refresh schedules.
- Strong knowledge of the insurance industry and ability to translate complex business requirements into effective BI solutions.
- Experience with Power Query for data transformation and creating optimized queries for Power BI data models.
- Ability to design and implement dimensional models (star schema, snowflake schema) that support scalable and high-performance reporting.
- Experience in data governance and security frameworks for BI platforms, including managing user access and report permissions.
- Strong communication skills with the ability to work with both technical and non-technical stakeholders to gather requirements and provide ongoing support.

Preferred Skills:
- Certification in Power BI (Microsoft Certified: Data Analyst Associate) or related Microsoft certifications.
- Experience with Power BI administration, including managing workspaces, dataflows, and collaboration between teams.
- Knowledge of Agile methodologies and experience working in a scrum team to deliver BI solutions.
- Experience with Tableau Server and understanding of report performance optimization on Tableau prior to migration.
- Familiarity with cloud-based Power BI solutions, including Azure integration and management of cloud-based data sources.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Senior Data Engineer at Career Mantra, you will be responsible for designing, developing, and maintaining scalable data pipelines while optimizing data storage solutions. Your role will involve working with cloud platforms, big data technologies, and ETL processes to support business intelligence and analytics.

Key Responsibilities:
- Design and implement scalable data pipelines using Python, PySpark, and Big Data tools.
- Optimize data processing performance in AWS.
- Collaborate with business teams to deliver reliable data solutions.
- Develop and maintain ETL processes using tools like Airflow, Azkaban, and AWS Glue.
- Troubleshoot and ensure data integrity across data pipelines.
- Mentor junior team members and promote best practices.

Key Skills:
- Strong experience with Python, PySpark, SQL, and AWS (Redshift, Glue, S3).
- Expertise in big data technologies (Hadoop, Hive, Spark).
- Proficiency in data modeling and ETL orchestration.
- Experience with business intelligence tools (Tableau, Power BI).
- Strong problem-solving and communication skills.

Qualifications and Skills:
- Extensive experience with data mapping for accurate data translation and transformation across systems.
- Proficiency in ETL processes for efficient extraction, transformation, and loading of data.
- Strong knowledge of API integration for smooth communication between different software solutions.
- Expertise in system integration to ensure harmonious operation of various IT systems and platforms.
- Experience with middleware technologies to facilitate connectivity and collaboration between applications.
- Competency in scripting languages for automating routine processes and enhancing system functions.
- Proven database management skills to ensure data integrity and performance optimization.
- Adept at troubleshooting to identify, diagnose, and resolve technical issues promptly.

Roles and Responsibilities:
- Design and implement solutions for complex data integration projects to enhance operational efficiency.
- Collaborate with cross-functional teams to ensure smooth integration and alignment of IT systems.
- Develop and execute testing plans to verify the functionality and performance of integrated systems.
- Monitor system performance and suggest necessary improvements for optimal operation and stability.
- Serve as a technical expert, providing guidance and support to junior team members and stakeholders.
- Maintain comprehensive documentation of integration processes, configurations, and best practices.
- Stay current with technology trends and advancements to incorporate industry best practices.
- Ensure compliance with company policies and standards in all integration activities.
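To sketch the AWS side of the pipeline work described here, the snippet below uses boto3 to list newly landed files in an S3 prefix before an ETL run picks them up. The bucket and prefix names are hypothetical placeholders.

```python
# A minimal boto3 sketch; bucket and prefix are illustrative assumptions.
import boto3

s3 = boto3.client("s3")

# Page through objects under a (hypothetical) raw-data prefix.
paginator = s3.get_paginator("list_objects_v2")
pages = paginator.paginate(Bucket="example-data-lake",
                           Prefix="raw/orders/2024-01-01/")

pending_files = []
for page in pages:
    for obj in page.get("Contents", []):
        # Collect keys so a downstream ETL task knows what to process.
        pending_files.append(obj["Key"])

print(f"{len(pending_files)} files waiting for ingestion")
```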
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
The Oracle Master Data Management (MDM) role is crucial for ensuring the accuracy, consistency, and effective management of an organization's master data. With the increasing reliance on data for strategic decision-making, a robust MDM framework is essential. As an Oracle MDM specialist, you will utilize Oracle MDM solutions to streamline data management practices, ensuring data integrity, compliance, and accessibility across various business units. Your role will involve enhancing data quality by implementing best practices for data governance and data lifecycle management. Collaboration with stakeholders such as IT, business units, and data stewards will be key in establishing and maintaining a central repository of master data. You will be responsible for identifying data issues, proposing solutions, and providing support for data ingestion and consolidation processes. Your contributions will be vital in enabling accurate reporting, compliance, and operational efficiencies to support organizational goals.

Key responsibilities include:
- Designing and implementing MDM solutions using Oracle technologies.
- Defining data requirements and quality standards with business units.
- Developing and maintaining data models and metadata.
- Monitoring and improving data quality through profiling and cleansing.
- Facilitating data governance initiatives.
- Integrating master data with other enterprise systems.
- Managing data lifecycle processes and conducting data audits.
- Training end users and data stewards on MDM tools and best practices.
- Documenting data standards, policies, and procedures.
- Identifying opportunities for automation and process improvement.
- Providing insights on data management strategies.
- Participating in project planning and execution for MDM initiatives.
- Resolving data-related issues and ensuring compliance with data privacy and protection regulations.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Systems, or a related field, along with at least 5 years of experience in Oracle MDM or similar data management roles. Proficiency in SQL; experience with Oracle databases, data modeling, data governance frameworks, data integration techniques, ETL processes, and data quality tools; problem-solving and communication skills; project management and agile methodologies; analytical skills; familiarity with reporting tools and data visualization techniques; knowledge of regulatory compliance issues; and certifications in Oracle MDM or related areas are advantageous. Additionally, you should be able to work collaboratively, possess proficiency in documentation and technical writing, and demonstrate a willingness to stay updated on industry trends in data management.
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
saudi arabia
On-site
Description: We are seeking a skilled Data Analyst to join our team in India. The ideal candidate will have 4-6 years of experience in data analysis, possessing a strong analytical mindset and the ability to translate data into actionable insights. This role requires collaboration with various teams to support data-driven decision-making.

Responsibilities:
- Collect and analyze data from various sources to identify trends and insights.
- Create and maintain dashboards and reports to visualize key performance indicators (KPIs).
- Collaborate with cross-functional teams to understand their data needs and deliver actionable insights.
- Develop and implement data models and algorithms to support decision-making processes.
- Conduct statistical analysis and interpret results to provide recommendations for business improvement.

Skills and Qualifications:
- Bachelor's degree in Data Science, Statistics, Computer Science, or a related field.
- 4-6 years of experience in data analysis or a related field.
- Proficiency in SQL for querying databases and data manipulation.
- Strong knowledge of data visualization tools such as Tableau, Power BI, or similar.
- Experience with programming languages such as Python or R for data analysis.
- Familiarity with statistical analysis techniques and methodologies.
- Excellent problem-solving skills and attention to detail.
- Ability to communicate complex data findings to non-technical stakeholders.
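As a small illustration of the KPI reporting work described above, here is a hedged pandas sketch that aggregates a sales table into monthly KPIs. The file name and column names are hypothetical.

```python
# A minimal pandas sketch; the CSV schema is an illustrative assumption.
import pandas as pd

# Hypothetical input: one row per order with date, region, and revenue.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Monthly KPIs per region: total revenue, order count, average order value.
kpis = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M"))
    .groupby(["month", "region"])
    .agg(revenue=("revenue", "sum"),
         orders=("revenue", "size"),
         avg_order_value=("revenue", "mean"))
    .reset_index()
)

print(kpis.head())
```

A table like `kpis` would typically feed a Tableau or Power BI dashboard of the kind this role maintains.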
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
Agoda is an online travel booking platform that offers accommodations, flights, and more to travelers worldwide. With a global network of 4.7M hotels and holiday properties, as well as flights and activities, Agoda, as part of Booking Holdings, employs over 7,100 individuals from 95+ nationalities across 27 markets. The work environment at Agoda is characterized by diversity, creativity, and collaboration, where innovation is fostered through a culture of experimentation and ownership, ultimately enhancing the customer experience of exploring the world.

Bridging the World Through Travel is the core purpose of Agoda, as the company believes that travel enables people to enjoy, learn, and experience the world, bringing individuals and cultures closer together to foster empathy, understanding, and happiness. The dedicated team at Agoda, comprising skilled, driven, and diverse individuals from around the globe, shares a passion for making a positive impact on the travel industry. Leveraging innovative technologies and strong partnerships, Agoda aims to simplify and enrich the travel experience for all its customers.

The Data department at Agoda is responsible for managing all data-related requirements within the company. The team's primary objective is to facilitate and enhance the utilization of data through creative approaches and the deployment of powerful resources such as operational and analytical databases, queue systems, BI tools, and data science technology. By recruiting top talent from diverse backgrounds, Agoda strives to tackle the challenge of leveraging data effectively while supporting personal growth and success within a culture of diversity and experimentation. The Data team's role at Agoda is crucial, as various stakeholders rely on their expertise to drive informed decision-making and enhance the customer search experience while ensuring protection against fraudulent activities.

As a member of the Database Development team at Agoda, you will be involved in database design, data management, and database development, all integrated with automated Continuous Integration/Continuous Delivery (CI/CD) pipelines. Collaborating closely with product teams, your role will focus on providing high-quality database solutions, optimizing performance, and ensuring technical excellence to meet business requirements effectively.

In this role, your key responsibilities will include assisting in designing database schema and architecture, delivering database SQL code with quality and performance optimization, collaborating with cross-functional teams, utilizing automated database CI/CD pipelines, and keeping abreast of advancements in database technology. To succeed in this position, you should have a minimum of 4 years of experience in database development, proficiency in platforms like Microsoft SQL and Oracle, a relevant degree in Computer Science or a related field, advanced SQL query writing skills, and excellent communication abilities. Additionally, familiarity with CI/CD frameworks, Agile methodologies, NoSQL databases, and ETL processes would be advantageous.

Agoda is an Equal Opportunity Employer, committed to maintaining a diverse and inclusive workplace. If you are interested in joining our team, please submit your application for consideration, and we will keep your details on file for future opportunities. For more information, please refer to our privacy policy. Please note that Agoda does not accept third-party resumes, and any unsolicited resumes will not incur any fees.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
The ideal candidate for this role will be responsible for collaborating with business users and stakeholders to define and analyze problems, and offer optimal technical solutions. You will be required to translate business requirements into technical specifications and design functional Business Intelligence (BI) solutions. Presenting architecture and solutions to executive-level personnel will also be part of your responsibilities. It is essential to adhere to industry best practices throughout the design and architecture phases of the solution. Ensuring the robustness and reliability of BI solutions during development, testing, and maintenance is crucial. Documentation of all aspects of the BI system for future upgrades and maintenance is a key task. You will also provide guidance to guarantee data governance, security, and compliance best practices in the architecture.

The successful candidate should possess expertise in data modeling, including dimensional modeling, normalization/denormalization, and other data modeling techniques, along with proficiency in extract, transform, and load (ETL) processes. Strong SQL coding skills and knowledge of database design principles are essential. Experience with any BI platform, preferably Power BI, is preferred. Knowledge of data warehousing concepts and tools, as well as experience with cloud platforms like AWS, Azure, and Google Cloud, is advantageous. An understanding of data governance and data quality management, as well as proficiency in scripting languages like Python, R, or Java, is also required.

Minimum qualifications:
- 8+ years of experience in end-to-end design and architecture of enterprise-level data platforms and reporting/analytical solutions.
- 5+ years of expertise in real-time and batch reporting and analytical solution architecture.
- 4+ years of experience with Power BI, Tableau, or similar technology solutions.

Preferred qualifications:
- 8+ years of experience with dimensional modeling and data lake design methodologies.
- Experience with relational and non-relational databases like SQL Server and Cosmos.
- Strong communication and collaboration skills, along with creative problem-solving abilities.
- A Bachelor's degree in computer science or equivalent work experience.
- Experience with Agile/Scrum methodology, knowledge of the tax and accounting domain, and an Azure Data Engineer certification are advantageous.

Applicants may be required to appear onsite at a Wolters Kluwer office as part of the recruitment process.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are an experienced Alteryx Developer with 5+ years of experience, seeking to join a dynamic data and analytics team at Teknikoz. Your primary responsibility will be to design, develop, and maintain data workflows using Alteryx to support business intelligence, reporting, and data governance initiatives. Your expertise in ETL processes, data preparation, and integration with BI tools such as Tableau, Power BI, or Qlik will be crucial for success in this role.

Your key responsibilities will include designing and developing scalable Alteryx workflows to automate data preparation, blending, and transformation processes. You will collaborate with business analysts, data engineers, and stakeholders to gather requirements and translate them into effective data solutions. Additionally, you will optimize workflows for performance and scalability; develop and maintain data documentation; integrate Alteryx workflows with visualization and reporting tools; perform data validation; troubleshoot and resolve issues; support deployment and scheduling of workflows; and contribute to data governance and process automation initiatives.

To excel in this role, you should possess strong knowledge of data preparation, ETL concepts, and data warehousing. Proficiency in SQL and working with relational databases like SQL Server, Oracle, or Snowflake is essential. Experience with Alteryx Server and scheduling workflows, familiarity with BI tools such as Tableau, Power BI, or Qlik, and excellent analytical, problem-solving, and communication skills are also required.

Preferred qualifications for this position include Alteryx Designer or Alteryx Advanced Certification, experience working in Agile/Scrum environments, exposure to cloud platforms like AWS, Azure, or GCP, and knowledge of Python or R for data manipulation within Alteryx. If you are a detail-oriented Alteryx Developer with a passion for data and analytics, this opportunity at Teknikoz may be the perfect fit for you.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a DBT professional, you will be responsible for designing, developing, and defining technical architecture for data pipelines and performance scaling in a big data environment. Your expertise in PL/SQL, including queries, procedures, and JOINs, will be crucial for the integration of Talend data and ensuring data quality. You will also be proficient in Snowflake SQL, writing SQL queries against Snowflake, and developing scripts in Unix, Python, etc., to facilitate Extract, Load, and Transform operations. It would be advantageous to have hands-on experience and knowledge of Talend. Candidates with previous experience in PROD support will be given preference.

Your role will involve working with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. You will be responsible for data analysis, troubleshooting data issues, and providing technical support to end users. In this position, you will develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Your problem-solving skills will be put to the test, and you will be expected to have a continuous-improvement approach. A Talend/Snowflake certification would be considered desirable. Excellent SQL coding skills, effective communication, and documentation skills are essential. Knowledge of the Agile delivery process is preferred. You must be analytical, creative, and self-motivated to excel in this role. Collaboration within a global team environment is key, necessitating excellent communication skills.

Your contribution to Virtusa will be valued, as teamwork, quality of life, and professional development are the core values the company upholds. By joining a global team of 27,000 professionals, you will have access to exciting projects and opportunities to work with cutting-edge technologies throughout your career. Virtusa provides an environment that nurtures new ideas, fosters excellence, and encourages personal and professional growth.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As an Alteryx to Dataiku Migration Specialist at Novartis in Hyderabad, you will play a crucial role in our company's data strategy by leading large-scale migration projects from Alteryx to Dataiku. With over 10 years of experience in data analytics, ETL processes, and migration procedures, you will be responsible for ensuring a seamless transition to Dataiku while minimizing disruptions to business operations.

Your key responsibilities will include planning, managing, and executing migration projects, developing comprehensive migration strategies, and proposing alternate target tools to the business with strong technical reasoning. You will leverage your technical expertise in Alteryx and Dataiku to address challenges during the migration process and optimize data workflows and ETL processes within Dataiku for improved performance post-migration. Collaboration will be essential as you work closely with cross-functional teams, provide training and support to staff on the new Dataiku platform, and maintain thorough documentation of migration processes. Your meticulous attention to detail, customer focus, and ability to remain resilient under pressure will be critical to the success of the migration projects.

To be successful in this role, you should have a minimum of 10 years of experience in ETL/complex data processing, hands-on technical expertise in Dataiku and Alteryx, and strong coding skills. Previous experience in a similar role within a large organization and proficiency in Python would be desirable. Additionally, experience with Alteryx to Dataiku migration is a plus, along with the ability to understand Alteryx use cases and migrate them to Dataiku.

Join Novartis in reimagining medicine to improve and extend people's lives. Be part of a diverse and inclusive work environment where your contributions drive us closer to our ambitions. If you are detail-oriented, customer-focused, and thrive in a fast-paced environment, consider joining our team to create a brighter future together. Learn more about Novartis's commitment to diversity and inclusion and explore career opportunities by joining the Novartis Network. Find out how you can thrive personally and professionally with our benefits and rewards. Join us in transforming healthcare and making a difference in patients' lives.

(Note: This is a full-time regular position based in Hyderabad, India.)
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
In this role, you will have a significant impact on the design, development, and optimization of scalable data products in the Telecom Analytics domain. Collaborating with diverse teams, you will implement AI-driven analytics, autonomous operations, and programmable data solutions. This position provides an exciting opportunity to work with cutting-edge Big Data and Cloud technologies, enhance your data engineering skills, and contribute to the advancement of Nokia's data-driven telecom strategies. If you have a passion for creating innovative data solutions, mastering cloud and big data platforms, and thrive in a fast-paced, collaborative environment, then this role is tailored for you!

You will play a crucial role in various aspects, including but not limited to:
- Data Governance: Managing source data within the Metadata Hub and Data Catalog.
- ETL Development: Developing and executing data processing graphs using Express It and the Co-Operating System.
- ETL Optimization: Debugging and optimizing data processing graphs using the Graphical Development Environment (GDE).
- API Integration: Leveraging Ab Initio APIs for metadata and graph artifact management.
- CI/CD Implementation: Implementing and maintaining CI/CD pipelines for metadata and graph deployments.
- Team Leadership & Mentorship: Mentoring team members and promoting best practices in Ab Initio development and deployment.

You should possess:
- A Bachelor's or Master's degree in computer science, Data Engineering, or a related field, with at least 8 years of experience in data engineering focusing on Big Data, Cloud, and Telecom Analytics.
- Hands-on expertise in Ab Initio for data cataloguing, metadata management, and lineage.
- Skills in data warehousing, OLAP, and modeling using BigQuery, Clickhouse, and SQL.
- Experience with data persistence technologies such as S3, HDFS, and Iceberg.
- Proficiency in Python and scripting languages.

Additional experience in the following areas would be beneficial:
- Data exploration and visualization using Superset or BI tools.
- Knowledge of ETL processes and streaming tools like Kafka.
- Background in building data products for the telecom domain and understanding AI and machine learning pipeline integration.

At Nokia, we are dedicated to driving innovation and technology leadership across mobile, fixed, and cloud networks. Join us to make a positive impact on people's lives and contribute to building a more productive, sustainable, and inclusive world. We foster an inclusive working environment where new ideas are welcomed, risks are encouraged, and authenticity is celebrated. Nokia offers continuous learning opportunities, well-being programs, support through employee resource groups, mentoring programs, and a diverse team with an inclusive culture where individuals can thrive and feel empowered. We are committed to inclusion and are proud to be an equal opportunity employer. Join our team at Nokia, the growth engine leading the transition to cloud-native software and as-a-service delivery models for communication service providers and enterprise customers. Be part of a collaborative team of dreamers, doers, and disruptors who push boundaries from the impossible to the possible.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
You have an exciting opportunity for the role of Senior Data Scientist - Analytics. You should possess a degree in mathematics, finance, statistics, engineering, computer science, or a related field; experience in statistics, regression, time series, and econometrics is preferred. You should have at least 4 years of hands-on experience working with R and R-Shiny, along with a strong command of SQL and a genuine passion for open-source solutions. A minimum of 2 years of experience in project management and team management is also required.

Your role will require robust knowledge of Data Structures & Algorithms, APIs, ETL processes, testing methodologies, and cloud technologies, especially AWS. Experience in designing and building exploratory data analysis and data visualization is essential. Having a solution-oriented mindset and being self-directed are important qualities for this role, and you should have experience using version control software and project management solutions. It is crucial to align with the values of Passion, Integrity, Excellence, and Results as defined by Argus.

If you find this opportunity appealing, we encourage you to share your resume with divya@thinkparms.in.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
We are seeking a highly motivated and skilled Salesforce Data Cloud Lead to take charge of designing, developing, and optimizing Data Cloud data models and use cases. You will collaborate closely with cross-functional teams, including Marketing, IT, and Data Analytics, to deliver dynamic content and personalized experiences to our customers.

Key Responsibilities:
- Lead the complete implementation of Salesforce Data Cloud (CDP), encompassing data acquisition, integration, quality assurance, and utilization.
- Configure and execute data-driven segmentation strategies to ensure precise audience targeting and content delivery.
- Design, document, and implement data models, data pipelines, and transformations to facilitate data ingestion, integration, and enrichment within Salesforce Data Cloud.
- Stay curious and stay updated with the fast-paced releases introduced to the platform.
- Work in collaboration with IT teams to guarantee seamless data integration, troubleshoot technical issues, and enhance system performance for data initiatives.
- Integrate data from diverse sources such as CRM systems, databases, and third-party platforms to bolster marketing and personalization efforts.
- Provide training and assistance to development teams in leveraging Salesforce Data Cloud features and functionalities.

SFDC Skills:
- Customize and optimize the Data Cloud platform to align with business requirements.
- Integrate Data Cloud with Salesforce CRM, Salesforce MC, Salesforce Marketing Intelligence, websites/microsites using the SDK method (configuring connectors, sitemaps, schema, data streams), Snowflake, and other sources.
- Establish and manage data streams from various sources to ensure seamless data flow.
- Configure and develop criteria for identity management, data transformations, and calculated insights.
- Set up lead scoring based on customer data from CRM and engagement data from different touchpoints, such as websites and MC engagement, using data transformations and calculated insights.
- Configure data transformations for data lake objects.
- Create and maintain data models that elevate data quality and usability.
- Aid in forming customer segments and supporting marketing and analytics teams.
- Monitor the platform to promptly identify and resolve disruptions or errors.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Marketing, or a related field.
- Proven experience (2+ years) with a strong emphasis on Salesforce Data Cloud and/or custom database solutions.
- Salesforce Data Cloud Consultant certification preferred.
- Thorough understanding of marketing automation, data segmentation, personalized customer journeys, decisioning, and Next Best Action.
- Proficiency in data integration and API utilization.
- Expertise in data modeling, ETL processes, data integration tools, and SQL.
- Hands-on experience with customer data platforms (CDPs) and data management practices, including data governance and compliance.
- Excellent problem-solving abilities and meticulous attention to detail, with adeptness in troubleshooting operational challenges.
- Familiarity with cloud technologies (e.g., AWS, Azure, GCP) and data modeling/scoring technologies.
- Strong communication and collaboration skills, enabling effective teamwork across cross-functional teams.

Location: DGS India - Pune - Baner M-Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
The PL/SQL Developer position involves designing, developing, and maintaining Oracle database solutions to meet business requirements effectively. You will be responsible for creating efficient PL/SQL scripts, optimizing database performance, and collaborating with cross-functional teams to ensure seamless integration of database solutions. The ideal candidate for this role should possess strong problem-solving skills and a deep understanding of relational database concepts.

Your key responsibilities will include developing, testing, and maintaining complex PL/SQL packages, procedures, functions, and triggers for data processing and ETL tasks. You will also design and implement database schemas and objects such as tables, indexes, and views. Analyzing business requirements and translating them into technical solutions using PL/SQL will be a crucial part of your role. Additionally, you will optimize SQL queries and database performance for high efficiency, perform data analysis to support report generation, and develop migration scripts for data transfer between systems. Ensuring compliance with security standards to protect sensitive data and providing technical support for production systems will also be part of your responsibilities, as will documenting technical specifications and creating reusable code for scalability.

In terms of required skills, you should have proficiency in Oracle PL/SQL programming, with experience in developing stored procedures, functions, and triggers. A strong understanding of relational database concepts (RDBMS) and performance-tuning techniques is necessary. Experience with ETL processes and data warehouse integration, along with knowledge of advanced PL/SQL features like collections, ref cursors, dynamic SQL, and materialized views, is also required. Familiarity with tools like SQL Developer, Toad, or similar IDEs, and exposure to Unix/Linux scripting, will be beneficial.

Moreover, the role requires strong analytical and problem-solving abilities, excellent communication skills to interact with stakeholders and team members effectively, attention to detail with a focus on accuracy in coding and testing, and the ability to work both independently and in a team environment. Qualifications for this position include a Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience) and 3 to 4+ years of proven experience in Oracle PL/SQL development; certifications in Oracle Database technologies are a plus.

This is a full-time position with a night shift schedule and a performance bonus. The location is Chennai, Tamil Nadu, and the work is in person. Immediate joiners are preferred.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
As an Engineering Manager at Frequence, you will have the opportunity to lead the Insights and Analytics team, providing strategic direction and ensuring alignment across teams. Your role will involve managing high-performance international individual contributors and overseeing the successful execution of complex, business-critical projects. Anticipating challenges, managing risks, and delivering impactful results will be essential to drive revenue and support the company's growth strategies.

The team you will be leading is the driving force behind Madhive's Intelligence and Measurement, empowering customers with crystal-clear insights into their campaigns. Working collaboratively, you will identify and drive new strategic opportunities, weave AI into data and reporting solutions, and align technical efforts with business goals. Leading technical design and execution across projects, you will advocate for strong engineering practices and ensure that engineering outcomes have a measurable impact on company performance.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or possess equivalent work experience. With 7+ years of software development experience, including 3+ years in leadership roles, you should have a deep understanding of design patterns, software development methodologies, and distributed systems architecture. Proficiency in relevant technologies and tools such as Looker, Airflow, Spark, and streaming architectures is required. Experience in data pipelining, data warehousing, data modeling, and cloud infrastructure, preferably GCP, will be beneficial. Your ability to drive complex cross-functional projects, manage risks, and foster a culture of growth and accountability will be crucial. Being data-driven and using metrics to track team performance and make strategic decisions will also be key to success in this role.

Frequence offers a dynamic, diverse, innovative, and friendly work environment where creativity is embraced and differences are valued. As part of a trail-blazing team, you will have the opportunity to think big, make an impact, and contribute to the growth and success of the company. Please note that third-party recruiting agencies will not be considered for this search.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
Join GlobalLogic and become a valuable part of the team working on a significant software project for a world-class company providing M2M/IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Your engagement with us will involve contributing to the development of end-user modules' firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and performing analysis and estimation of customer requirements.

You should have 8 to 10 years of relevant experience in data engineering, with team-leading experience. Your key skills should include expertise in modern data architectures, cloud-based solutions (preferably Azure), and ETL processes. Strong programming skills in SQL, Python, Spark, etc., along with hands-on experience with Azure storage, CDATA Databricks, and Pytran, are required. Excellent communication and leadership skills will be essential for this role.

As part of your responsibilities, you will be expected to maintain, improve, and re-architect data pipelines to ensure scalability, reliability, and efficiency. Creating data pipelines for ingestion, cleaning, and processing data will be a key task. Collaboration with cross-functional teams to integrate new data sources, acting as a technical expert, leading and mentoring junior data engineers, and owning the end-to-end data strategy are crucial aspects of this role. You will also be responsible for implementing best practices for data governance, security, and compliance.

At GlobalLogic, you will have the opportunity to work on exciting projects across industries like High-Tech, communication, media, healthcare, retail, and telecom. You will collaborate with a diverse team of highly talented individuals in an open and laidback environment, with opportunities for professional growth and development through training programs and certifications. We prioritize work-life balance by offering flexible work schedules, work-from-home options, paid time off, and holidays. We provide competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), health awareness programs, extended maternity leave, performance bonuses, and referral bonuses. Additionally, you can enjoy fun perks such as sports events, cultural activities, food subsidies, corporate parties, and discounts at popular stores and restaurants.

GlobalLogic is a leader in digital engineering, helping brands worldwide design and build innovative products and digital experiences. With a focus on experience design, engineering, and data expertise, we accelerate our clients' transition into tomorrow's digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers globally, serving customers in various industries. As part of the Hitachi Group Company, we contribute to driving innovation through data and technology for a sustainable society with a higher quality of life.
Posted 1 week ago