2.0 - 4.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Fusion Plus Solutions Inc is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Tech Stalwart Solution Private Limited is looking for a Sr. Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
The role requires strategic thinking and planning, and provides expertise throughout the entire product development life cycle, with strong knowledge of SAS Viya programming, API architecture, Kubernetes, and the Risk and Finance domain. The role also requires ownership, making sure that quality is baked in from the start.
Key Responsibilities: N/A
Skills and Experience:
- Hands-on in Python data manipulation (Pandas, NumPy) and the SAS developer framework; expert in SAS development work.
- Desirable skills in the SAS admin framework.
- Desirable skills in the Hadoop development framework.
- Sound statistical knowledge, analytical and problem-solving skills are desirable.
- Good to have knowledge of big data technologies (Hortonworks HDP, Apache Hadoop, HDFS, Hive, Sqoop, Flume, ZooKeeper, HBase, Oozie, Spark, NiFi, Kafka, SnapLogic, AWS, Redshift).
- Experience with monitoring tools.
- Development capabilities using Python, Spark, SAS, and R.
- Good management and analytical skills.
- Good writing and oral communication skills.
- Good understanding of and experience in projects (e.g. SDLC, Agile methodology).
- Desirable skills in the big data space (Hadoop stack: HDFS, Pig, Hive, HBase, Sqoop, etc.).
- Ability to debug and write/modify shell scripts and Python.
- Willing to work on-call support over weekends.
- Liaise with multiple application teams and coordinate issue resolution.
- Good analytical and interaction skills.
Qualifications: N/A
Together we:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do.
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well.
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing:
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which together total a minimum of 30 days.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform; development courses for resilience and other human skills; a global Employee Assistance Programme; sick leave; mental health first-aiders; and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.
Posted 2 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Streaming data - Technical skills requirements:
Experience: 5+ years
- Solid hands-on and solution-architecting experience in big data technologies (AWS preferred).
- Hands-on experience in AWS DynamoDB, EKS, Kafka, Kinesis, Glue, and EMR.
- Hands-on experience with a programming language like Scala with Spark.
- Good command and working experience of Hadoop MapReduce, HDFS, Hive, HBase, and/or NoSQL databases.
- Hands-on working experience on any of the data engineering/analytics platforms (Hortonworks, Cloudera, MapR, AWS); AWS preferred.
- Hands-on experience with data ingestion: Apache NiFi, Apache Airflow, Sqoop, and Oozie.
- Hands-on working experience of data processing at scale with event-driven systems and message queues (Kafka, Flink, Spark Streaming), with hands-on development experience on the above technologies.
- Data warehouse exposure to Apache NiFi, Apache Airflow, and Kylo.
- Operationalization of ML models on AWS (e.g. deployment, scheduling, model monitoring, etc.).
- Feature engineering and data processing to be used for model development.
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.).
- Experience building data pipelines for structured/unstructured, real-time/batch, synchronous/asynchronous events using MQ, Kafka, and stream processing.
- Hands-on working experience in analyzing source system data and data flows, working with structured and unstructured data.
- Must be very strong in writing SQL queries.
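For context on the kind of event-driven processing this role describes, here is a minimal PySpark Structured Streaming sketch that reads events from Kafka and aggregates them in windows. The broker address, topic name, and schema are hypothetical, and it assumes Spark is launched with the spark-sql-kafka connector package available:

```python
# Minimal sketch: consume a hypothetical "events" Kafka topic and count
# actions per 5-minute window. Assumes the spark-sql-kafka package is on
# the Spark classpath and a broker is reachable at localhost:9092.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StructType, StringType, TimestampType

spark = SparkSession.builder.appName("event-stream-demo").getOrCreate()

schema = (StructType()
          .add("user_id", StringType())
          .add("action", StringType())
          .add("ts", TimestampType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "events")
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Aggregate: count actions per 5-minute window.
counts = events.groupBy(window(col("ts"), "5 minutes"), col("action")).count()

# Write to the console for demonstration; a real pipeline would target a
# sink such as a warehouse table or another topic.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```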
Posted 2 weeks ago
3.0 - 6.0 years
13 - 18 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact at ZS.
At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems: the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.
ZS's Platform Development team designs, implements, tests and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products and analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact and contribute to better health outcomes.
What you'll do:
- Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest quality multi-tenant software that can be productized.
- As part of our full-stack product engineering team, build multi-tenant cloud-based software products/platforms and internal assets that leverage cutting-edge technology on the Amazon AWS cloud platform.
- Work with junior developers to implement large features that are on the cutting edge of big data.
- Be a technical leader to your team, and help them improve their technical skills.
- Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design.
- Work with product managers and architects to design product architecture and to work on POCs.
- Take immediate responsibility for project deliverables.
- Understand client business issues and design features that meet client needs.
- Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills.
What you'll bring:
- Bachelor's degree in CS, IT, or a related discipline.
- Strong analytic, problem-solving, and programming ability.
- Experience in coding in an object-oriented language such as Python, Java, or C#.
- Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies.
- Experience with development on the AWS (Amazon Web Services) platform is preferable.
- Experience in Linux shell or PowerShell scripting is preferable.
- Experience in HTML5, JavaScript, and JavaScript libraries is preferable.
- Understanding of data science algorithms.
- Good to have: pharma domain understanding.
- Initiative and drive to contribute.
- Excellent organizational and task management skills.
- Strong communication skills.
- Ability to work in global cross-office teams.
- ZS is a global firm; fluency in English is required.
Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member.
We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.
Travel
Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.
Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.
NO AGENCY CALLS, PLEASE.
Find Out More At www.zs.com
Posted 2 weeks ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities: Scala, Java, Spark (Spark Streaming, MLlib), Kafka or equivalent cloud big data components, SQL, PostgreSQL, T-SQL/PL-SQL, Hadoop (Airflow, Oozie, HDFS, Sqoop, Hive, Pig, MapReduce), shell scripting, and cloud technologies (GCP preferable).
Mandatory Skill Sets: Scala, Spark, GCP
Preferred Skill Sets: Scala, Spark, GCP
Years of Experience Required: 4 - 8
Education Qualification: B.Tech / M.Tech / MBA / MCA
Education (if blank, degree and/or field of study not specified): Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering
Degrees/Field of Study Preferred:
Certifications (if blank, certifications not specified):
Required Skills: Google Cloud Platform (GCP)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}
Desired Languages (if blank, desired languages not specified):
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
Posted 2 weeks ago
6.0 - 11.0 years
18 - 25 Lacs
Hyderabad
Work from Office
SUMMARY
Data Modeling Professional
Location: Hyderabad/Pune
Experience: The ideal candidate should possess at least 6 years of relevant experience in data modeling, with proficiency in SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools).
Key Responsibilities:
- Develop and configure data pipelines across various platforms and technologies.
- Write complex SQL queries for data analysis on databases such as SQL Server, Oracle, and Hive.
- Create solutions to support AI/ML models and generative AI.
- Work independently on specialized assignments within project deliverables.
- Provide solutions and tools to enhance engineering efficiencies.
- Design processes, systems, and operational models for end-to-end execution of data pipelines.
Preferred Skills: Experience with GCP, particularly Airflow, Dataproc, and BigQuery, is advantageous.
Requirements:
- Minimum 6 years of experience in data modeling with SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools).
- Proficiency in writing complex SQL queries for data analysis.
- Strong problem-solving and analytical abilities.
- Excellent communication and presentation skills.
- Ability to deliver high-quality materials against tight deadlines.
- Ability to work effectively under pressure with rapidly changing priorities.
Note: The ability to communicate efficiently at a global level is paramount.
Posted 2 weeks ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: AI Engineer
Salary: 4 - 5.4 LPA
Experience: Minimum 2 years
Location: Hinjewadi, Pune
Work Mode: Work from Office
Availability: Immediate Joiner
About Us: Rasta.AI, a product of AI Unika Technologies (P) Ltd, is a pioneering technology company based in Pune. We specialize in road infrastructure monitoring and maintenance using cutting-edge AI, computer vision, and 360-degree imaging. Our platform delivers real-time insights into road conditions to improve safety, efficiency, and sustainability. We collaborate with government agencies, private enterprises, and citizens to enhance road management through innovative tools and solutions.
The Role: This is a full-time, on-site role. As an AI Engineer, you will be responsible for developing innovative AI models and software solutions to address real-world challenges. You will collaborate with cross-functional teams to identify business opportunities and provide customized solutions. You will also work alongside talented engineers, designers, and data scientists to implement and maintain these models and solutions.
Technical Skills:
- Programming Languages: Python (and other AI-supported languages)
- Databases: SQL, Cassandra, MongoDB
- Python Libraries: NumPy, Pandas, scikit-learn
- Deep Neural Networks: CNNs, RNNs, and LLMs
- Data Analysis Libraries: TensorFlow, Pandas, NumPy, scikit-learn, Matplotlib, TensorBoard
- Frameworks: Django, Flask, Pyramid, and CherryPy
- Operating Systems: Ubuntu, Windows
- Tools: Jupyter Notebook, PyCharm IDE, Excel, Roboflow
- Big Data (Bonus): Hadoop (Hive, Sqoop, Flume), Kafka, Spark
- Code Repository Tools: Git, GitHub
- DevOps on AWS: Docker, Kubernetes, instance hosting and management
Analytical Skills: Exploratory Data Analysis, Predictive Modeling, Text Mining, Natural Language Processing, Machine Learning, Image Processing, Object Detection, Instance Segmentation, Deep Learning, DevOps, AWS Knowledge
Expertise:
- Proficiency in the TensorFlow library with RNNs and CNNs
- Familiarity with pre-trained models like VGG-16, ResNet-50, and MobileNet
- Knowledge of Spark Core, Spark SQL, Spark Streaming, Cassandra, and Kafka
- Designing and architecting Hadoop applications
- Experience with chatbot platforms (a bonus)
Responsibilities: The entire lifecycle of model development: data collection and preprocessing, model development, model training, model testing, model validation, deployment and maintenance, collaboration and communication.
Qualifications:
- Bachelor's or Master's degree in a relevant field (AI, Data Science, Computer Science, etc.)
- Minimum 2 years of experience developing and deploying AI-based software products
- Strong programming skills in Python (and potentially C++ or Java)
- Experience with machine learning libraries (TensorFlow, PyTorch, Keras, scikit-learn)
- Experience with computer vision, natural language processing, or recommendation systems
- Experience with cloud computing platforms (Google Cloud, AWS)
- Problem-solving skills
- Excellent communication and presentation skills
- Experience with data infrastructure and tools (SQL, NoSQL, and big data platforms)
- Teamwork skills
Join Us! If you are passionate about AI and want to contribute to groundbreaking projects in a dynamic startup environment, we encourage you to apply! Be part of our mission to drive technological advancement in India.
Drop your CV: hr@aiunika.com
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
Your Role and Responsibilities: As an Associate Software Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.
Preferred Education: Master's Degree
Required Technical and Professional Expertise: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark. Good to have: Python.
Preferred Technical and Professional Experience: None
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Data Governance Practitioner
Project Role Description: Establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Your typical day will involve collaborating with key stakeholders to define data standards, facilitating effective data collection, storage, access, and usage, and driving data stewardship initiatives for comprehensive and effective data governance. You will engage in discussions that shape the data landscape of the organization, ensuring that data practices align with established policies and standards.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the development and implementation of data governance frameworks and policies.
- Monitor compliance with data governance policies and report on data quality metrics.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data governance principles and best practices.
- Experience with data quality assessment and improvement techniques.
- Familiarity with data management tools and technologies.
- Ability to communicate complex data concepts to non-technical stakeholders.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: AWS Glue
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate regular team meetings to track progress and address any roadblocks.
Professional & Technical Skills:
- Must Have Skills: Proficiency in AWS Glue.
- Good To Have Skills: Experience with data integration and ETL processes.
- Strong understanding of cloud computing concepts and services.
- Familiarity with data warehousing solutions and best practices.
- Experience in scripting languages such as Python or SQL.
Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Glue.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Apache Airflow
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, ensuring that the applications meet the required standards and functionality while adapting to any changes in project scope or requirements.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Apache Airflow.
- Good To Have Skills: Experience with cloud platforms such as AWS or Azure.
- Strong understanding of workflow orchestration and scheduling.
- Experience with data pipeline development and management.
- Familiarity with containerization technologies like Docker.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Apache Airflow.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Pune
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse, Core Banking, PySpark
Good to have skills: AWS BigData
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Your role will also include troubleshooting data issues and optimizing data workflows to enhance performance and reliability.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to ensure efficiency and effectiveness.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse, Core Banking, PySpark.
- Good To Have Skills: Experience with AWS BigData.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud-based data solutions and architectures.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Pune
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse, Core Banking, PySpark
Good to have skills: AWS BigData
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data processing workflow. Your role will be pivotal in enhancing the efficiency and reliability of data operations within the organization.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processing workflows to optimize performance.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse, Core Banking, PySpark.
- Good To Have Skills: Experience with AWS BigData.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud-based data solutions and architectures.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 2 weeks ago
5.0 - 10.0 years
14 - 17 Lacs
Navi Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target data movement and implementing solutions that tackle the client's needs.
Your primary responsibilities include:
- Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Must have 5+ years of experience in big data: Hadoop, Spark, Scala, Python, HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience with Python to develop a custom framework for generating rules (much like a rules engine).
- Developed Python code to gather data from HBase and designed the solution to implement it using PySpark.
- Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive context objects were utilized to perform read/write operations.
Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
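To make the Hive read/transform/write pattern mentioned above concrete, here is a minimal PySpark sketch. The database and table names ("sales.orders", "sales.daily_totals") are hypothetical, and it uses SparkSession with enableHiveSupport(), the current successor to the older HiveContext the posting refers to:

```python
# Minimal sketch: read a Hive table, apply a business transformation with
# DataFrames, and write the result back to Hive. Table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, sum as spark_sum

spark = (SparkSession.builder
         .appName("hive-transform-demo")
         .enableHiveSupport()   # gives spark.table()/saveAsTable() access to the Hive metastore
         .getOrCreate())

orders = spark.table("sales.orders")  # read a Hive-managed table

# Business transformation: total order amount per day for completed orders.
daily = (orders.filter(col("status") == "COMPLETED")
         .groupBy("order_date")
         .agg(spark_sum("amount").alias("total_amount")))

# Write the aggregate back as a Hive table.
daily.write.mode("overwrite").saveAsTable("sales.daily_totals")
```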
Posted 2 weeks ago
6.0 - 11.0 years
14 - 17 Lacs
Mysuru
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.
Responsibilities:
- Manage end-to-end feature development and resolve challenges faced in implementing it.
- Learn new technologies and implement them in feature development within the time frame provided.
- Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Overall, more than 6 years of experience, with more than 4 years of strong hands-on experience in Python and Spark.
- Strong technical abilities to understand, design, write and debug applications in Python and PySpark.
- Strong problem-solving skills.
Preferred technical and professional experience:
- Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 weeks ago
3.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Your Role
- Experience in data engineering and end-to-end implementation of CDP projects.
- Proficient in SQL, CDP (Treasure Data), Python/Digdag, Presto/SQL, and data engineering.
- Hands-on experience with Treasure Data CDP implementation and management.
- Excellent SQL skills, including advanced query writing and optimization.
- Oversee the end-to-end maintenance and operation of the Treasure Data CDP.
- Familiarity with data integration, API operations, and audience segmentation.
Your Profile
- Experience in unifying data across multiple brands and regions, ensuring consistency and accuracy.
- Ability to create and manage data workflows in Treasure Data.
- Collaborate with cross-functional teams to ensure successful data integration and usage.
- Troubleshoot and optimize data pipelines and processes for scalability and performance.
- Stay updated on the latest features and best practices in Treasure Data and related technologies.
Posted 2 weeks ago
3.0 - 6.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Job Role
- Strong Spark programming experience with Java.
- Good knowledge of SQL query writing and shell scripting.
- Experience working in Agile mode.
- Analyze, design, develop, deploy and operate high-performance, high-quality services that serve users in a cloud environment.
- Good understanding of the client ecosystem and expectations.
- In charge of code reviews, the integration process, test organization, and quality of delivery.
- Take part in development.
- Experienced in writing queries using SQL commands.
- Experienced in deploying and operating code in a cloud environment.
- Experienced in working without much supervision.
Your Profile
- Primary skills: Java, Spark, SQL.
- Secondary skills (good to have): Hadoop or any cloud technology, Kafka, or BO.
What you'll love about working here
Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means getting the support you need to shape your career in the way that works for you. It means when the future doesn't look as bright as you'd like, you have the opportunity to make change: to rewrite it. When you join Capgemini, you don't just start a new job. You become part of something bigger. A diverse collective of free-thinkers, entrepreneurs and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. At Capgemini, people are at the heart of everything we do! You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
Noida
On-site
JOB DESCRIPTION
Times Internet: Times Internet is the largest digital products company and the digital venture of Times of India, India's largest media and entertainment group. TIL websites are among the fastest growing web/mobile based networks worldwide. Since its inception in 1999, Times Internet has led the Internet revolution in India and has emerged as India's foremost web entity, running diverse portals and niche websites.
About the Business Unit: IT service provider to all the businesses within Times Internet.
Job Title: MySQL Database Admin
Location: Noida
Vertical: Sys Admin
Domain: Technology
Experience: 5-10 Years
Role Summary: The Database Professional will participate in design, implementation, automation, optimization and ongoing operational administration tasks for backend systems running on MySQL/MariaDB/MongoDB/RedisDB/Hadoop database infrastructure. Candidates should be ready to take on the immediate support and operational challenges of infrastructure platforms running database services.
Job Specifications:
- Hands-on experience with MySQL/MariaDB RDBMS and related tools like Percona XtraBackup and Percona Toolkit.
- Hands-on experience with MongoDB NoSQL and cache (RedisDB) datastores.
- Hands-on experience working on private as well as public IT infrastructure clouds like AWS.
- Hands-on experience in the optimization of SQL and NoSQL database queries.
- Working experience implementing optimized and secure practices for RDS, NoSQL and cache database stores.
- Working experience in infra-resource planning, database upgrades/patching, backup/recovery and database troubleshooting.
- Working experience in all database support activities, like replication, scalability, availability, and performance tuning/optimization, on database servers running large volumes of data.
- Working experience building highly scalable schema designs for applications using NoSQL, RDBMS and cache databases.
- Good to have: experience with Hadoop technologies like HDFS, Hive, Flume, Sqoop, Spark, etc.
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
Your Role and Responsibilities: As an Associate Software Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.
Preferred Education: Master's Degree
Required Technical and Professional Expertise: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark. Good to have: Python.
Preferred Technical and Professional Experience: None
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Desired Experience Range: 4 to 8 Years
Job Locations: Bangalore / Chennai / Hyderabad / Pune / Kochi
Required Skill Set: HDFS, Hive, Spark, Sqoop, Flume, Oozie, Unix scripting, Autosys
Must Have Skills:
1. Good understanding of Hadoop concepts, including the file system and MapReduce.
2. Hands-on experience with the Spark framework, Unix scripting, Hive queries, and writing UDFs in Hive. Theoretical knowledge and POCs alone will not suffice.
3. Good knowledge of the Software Development Life Cycle and Project Development Lifecycle.
4. The associate should be able to work independently and should have strong debugging skills in both Hive and Spark.
The associate should have experience developing large-scale systems, experience debugging and performance tuning, excellent software design skills, communication skills, and the ability to work with client partners.
Good to Have:
1. Experience in the Banking and Finance domain.
2. Experience in Agile methodology.
3. Knowledge of job scheduling tools like Autosys.
4. Knowledge of Kafka.
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
Times Internet: Times Internet is the largest digital products company and the digital venture of Times of India, India's largest media and entertainment group. TIL websites are among the fastest growing web/mobile based networks worldwide. Since its inception in 1999, Times Internet has led the Internet revolution in India and has emerged as India's foremost web entity, running diverse portals and niche websites.
About the Business Unit: IT service provider to all the businesses within Times Internet.
Job Title: MySQL Database Admin
Location: Noida
Vertical: Sys Admin
Domain: Technology
Experience: 5-10 Years
Role Summary: The Database Professional will participate in design, implementation, automation, optimization and ongoing operational administration tasks for backend systems running on MySQL/MariaDB/MongoDB/RedisDB/Hadoop database infrastructure. Candidates should be ready to take on the immediate support and operational challenges of infrastructure platforms running database services.
Job Specifications:
- Hands-on experience with MySQL/MariaDB RDBMS and related tools like Percona XtraBackup and Percona Toolkit.
- Hands-on experience with MongoDB NoSQL and cache (RedisDB) datastores.
- Hands-on experience working on private as well as public IT infrastructure clouds like AWS.
- Hands-on experience in the optimization of SQL and NoSQL database queries.
- Working experience implementing optimized and secure practices for RDS, NoSQL and cache database stores.
- Working experience in infra-resource planning, database upgrades/patching, backup/recovery and database troubleshooting.
- Working experience in all database support activities, like replication, scalability, availability, and performance tuning/optimization, on database servers running large volumes of data.
- Working experience building highly scalable schema designs for applications using NoSQL, RDBMS and cache databases.
- Good to have: experience with Hadoop technologies like HDFS, Hive, Flume, Sqoop, Spark, etc.
Posted 2 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Big Data Architect
Skills Required:
- 4+ years of experience as a Big Data Architect
- Proficient in Spark, Scala, Hadoop MapReduce/HDFS, Pig, Hive, and AWS cloud computing
- Hands-on experience with tools like: EMR, EC2, Pentaho BI, Impala, Elasticsearch, Apache Kafka, Node.js, Redis, Logstash, StatsD, Ganglia, Zeppelin, Hue, Kettle
- Sound experience in machine learning, ZooKeeper, Bootstrap.js, Apache Flume, Fluentd, collectd, Sqoop, Presto, Tableau, R, Grok, MongoDB, Apache Storm, HBase
- Hands-on development experience in Core Java and Advanced Java
Job Requirements:
- Bachelor's degree in Computer Science, Information Technology, or MCA
- 4+ years of experience in a relevant role
- Good analytical and problem-solving ability
- Detail-oriented with excellent written and verbal communication skills
- The ability to work independently as well as collaborate with a team
Experience: 10 Years
Job Location: Pune/Hyderabad, India
Posted 2 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Big Data Architect
Skills Required:
- 4+ years of experience as a Big Data Architect
- Proficient in Spark, Scala, Hadoop MapReduce/HDFS, Pig, Hive, and AWS cloud computing
- Hands-on experience with tools like: EMR, EC2, Pentaho BI, Impala, Elasticsearch, Apache Kafka, Node.js, Redis, Logstash, StatsD, Ganglia, Zeppelin, Hue, Kettle
- Sound experience in machine learning, ZooKeeper, Bootstrap.js, Apache Flume, Fluentd, collectd, Sqoop, Presto, Tableau, R, Grok, MongoDB, Apache Storm, HBase
- Hands-on development experience in Core Java and Advanced Java
Job Requirements:
- Bachelor's degree in Computer Science, Information Technology, or MCA
- 4+ years of experience in a relevant role
- Good analytical and problem-solving ability
- Detail-oriented with excellent written and verbal communication skills
- The ability to work independently as well as collaborate with a team
Experience: 10 Years
Job Location: Pune/Hyderabad, India
Posted 2 weeks ago
3.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Quadratyx: We are a product-centric insight & automation services company operating globally. We help the world's organizations make better and faster decisions using the power of insight and intelligent automation. We build and operationalize their next-gen strategy through Big Data, Artificial Intelligence, Machine Learning, Unstructured Data Processing and Advanced Analytics. Quadratyx can boast of more extensive experience in data sciences and analytics than most other companies in India. We firmly believe in Excellence Everywhere.
Job / Role Information:
Designation: Data Engineer
Function: Technical
Location: Hyderabad
Purpose of the Job/Role: The data engineer is expected to work on multiple projects, driving user story analysis and elaboration, designing and developing software applications, testing, building automation tools, and architecting the Big Data analytics framework.
Key Requisites:
- Expertise in data structures and algorithms.
- Excellent knowledge of Python programming.
- Knowledge of job orchestration frameworks like Airflow, Oozie, Luigi, etc.
- Experience in distributed data and computing tools, and a strong ability to troubleshoot in these tools: Sqoop2, Hive, Spark/PySpark, Hadoop, MapReduce.
- Experience with relational SQL.
- Knowledge of NoSQL databases like HBase, Cassandra, etc.
- Experience with source control tools such as GitHub and related dev processes.
- Communicate effectively, both verbally and in writing, with team members and business stakeholders.
Working Relationships:
Reporting to: Team Lead / Project Manager
External Stakeholders: Clients
Skills/Competencies Required:
Technical Skills:
- Strong expertise (9 or more out of 10) in at least one modern programming language, like Python or Java.
- Clear end-to-end experience in designing, programming, and implementing large software systems.
- Passion and analytical abilities to solve complex problems.
Soft Skills:
- Always speaking your mind freely.
- Communicating ideas clearly in talking and writing; the integrity to never copy or plagiarize the intellectual property of others.
- Exercising discretion and independent judgment where needed in performing duties; not needing micro-management; maintaining high professional standards.
Academic Qualifications & Experience Required:
- Bachelor's or Master's in Computer Science, Computer Engineering, or a related discipline from a well-known institute.
- Minimum 3-6 years of work experience as a data engineer in an IT organization (preferably with an Analytics / Big Data / Data Science / AI background).
Quadratyx is an equal opportunity employer - we will never differentiate candidates on the basis of religion, caste, gender, language, disabilities or ethnic group. Quadratyx reserves the right to place/move any candidate to any company location, partner location or customer location globally, in the best interest of Quadratyx business. Interested candidate profiles should be emailed to jobs@quadratyx.com
Posted 2 weeks ago
India has seen a rise in demand for professionals skilled in Sqoop, a tool designed for efficiently transferring bulk data between Apache Hadoop and structured datastores such as relational databases. Job seekers with expertise in Sqoop can explore various opportunities in the Indian job market.
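To illustrate the day-to-day work, here is a minimal sketch of a typical Sqoop bulk import, driven from Python via subprocess. The host, credentials, table name, and HDFS paths are hypothetical, and it assumes the sqoop CLI is installed and on the PATH:

```python
# Minimal sketch: import an RDBMS table into HDFS with Sqoop.
# All connection details and paths below are hypothetical examples.
import subprocess

cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://dbhost:3306/sales",  # source relational database
    "--username", "etl",
    "--password-file", "/user/etl/.sqoop.pwd",      # keeps the password off the command line
    "--table", "orders",                            # table to transfer
    "--target-dir", "/user/etl/orders",             # HDFS destination directory
    "--num-mappers", "4",                           # parallel map tasks for the bulk transfer
    "--fields-terminated-by", ",",
]
subprocess.run(cmd, check=True)  # raises CalledProcessError if the import fails
```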
The average salary range for Sqoop professionals in India varies based on experience levels:
- Entry-level: Rs. 3-5 lakhs per annum
- Mid-level: Rs. 6-10 lakhs per annum
- Experienced: Rs. 12-20 lakhs per annum
Typically, a career in Sqoop progresses as follows:
1. Junior Developer
2. Sqoop Developer
3. Senior Developer
4. Tech Lead
In addition to expertise in Sqoop, professionals in this field are often expected to have knowledge of:
- Apache Hadoop
- SQL
- Data warehousing concepts
- ETL tools
As you explore job opportunities in the field of Sqoop in India, make sure to prepare thoroughly and showcase your skills confidently during interviews. Stay updated with the latest trends and advancements in Sqoop to enhance your career prospects. Good luck with your job search!