
8 StreamSets Jobs

JobPe aggregates listings for easy access, but you actually apply on the original job portal directly.

3.0 - 5.0 years

5 - 8 Lacs

Pune

Work from Office

Source: Naukri

Role Purpose: Consultants are expected to complete specific tasks as part of a consulting project with minimal supervision. They will start to build core areas of expertise and will contribute to client projects, typically involving in-depth analysis, research, supporting solution development, and being a successful communicator. The Consultant must achieve high personal billability.

Do

Consulting Execution
- An ambassador for the Wipro tenets and values.
- Work-stream leader or equivalent; coordinates small teams.
- Receives great feedback from the client.
- Client-focused and tenacious in approach to solving client issues and achieving client objectives.
- Organises work competently and ensures timeliness and quality of deliverables.
- Has a well-grounded understanding of best practice in the given area and of the industry, and can apply this under supervision.
- Develops strong working relationships with team and client staff.

Business Development
- Ensures high levels of individual utilisation in line with the levels expected as part of the goal-setting process.
- Sells self by creating extensions to current assignments and demand for new assignments based on track record and reputation.
- Understands Wipro's core service and consulting offering.
- Builds relationships with client peers and provides the intelligence/insights required to solve clients' business problems.
- Identifies sales leads and extension opportunities.
- Anchors market research activities in the chosen area of work.

Thought Leadership
- Develops insight into chosen industry and technology trends.
- Contributes to team thought leadership.
- Ensures a track record of own assignments is written up and, where appropriate, turned into a case study.

Contribution to Practice/Wipro
- Delivers all Wipro admin in a timely manner (timesheets, appraisals, expenses, etc.).
- Demonstrates contribution to internal initiatives.
- Contributes to the IP and knowledge management of Wipro and GCG and ensures its availability on the central knowledge management repository of Wipro and GCG.
- Leverages tools, methods, assets, information sources, and IP available within the knowledge management platform.
- Engages with other consulting and delivery teams to enhance collaboration and growth, and takes part in Wipro 'Communities' activities.
- Proactively participates in and suggests ideas for practice development initiatives.
- Makes use of common methods and tools which are proven to work.
- Develops process assets and other reusable artefacts based on learnings from projects.
- Shares knowledge within the team and networks effectively with SMEs to bolster understanding and build skills.

Deliver Strategic Objectives (select relevant measures or modify them after speaking to your manager):
- Deliver growth in consulting revenues: support business performance for direct consulting against relevant quarterly/annual targets and improve quality of consulting through flawless delivery of transformation engagements. Measures: % of personal utilisation achieved (against target); no. of RFI/RFP responses supported; no. of transformation engagements delivered; no. of referenceable clients and testimonials; average CSAT/PCSAT across projects.
- Generate impact: enable pull-through business/impact for Wipro through front-end consulting engagements, deal pursuit, and client relationships. Measures: number and value of downstream opportunities identified for GCG and the larger Wipro.
- Grow market positioning: lead or actively contribute to the development of thought leadership, offerings, and assets for the practice to support business growth. Measures: eminence and thought leadership demonstrated through content, citations, and testimonials; contributions to white papers, POVs, and assets such as repeatable IP, frameworks, and methods; number of ideas generated and active contribution to the development of new consulting offerings/solutions/assets.
- Provide consulting leadership to accounts: support the GCG account lead/account team to grow the consulting service portfolio. Measures: number and $ value of consulting deals supported in the account.
- Grow the consulting talent: grow skills and capabilities to deliver consulting engagements in new industries, business themes, frameworks, and technologies. Measures: self-development, minimum 32 hours of training a year, combining online and classroom sessions on new industries, business themes, technologies, frameworks, etc.
- Build the consulting community: individual contribution to people development and collaboration effectiveness. Measures: distinct participation in and demonstration of collaboration across GCG (contribution to cross-practice offerings, sharing of best-practice and industrial/technological expertise, consulting community initiatives) and knowledge management (number of assets owned and contributed to Consulting Central).

Mandatory Skills: StreamSets.

Posted 6 days ago

Apply

5.0 - 9.0 years

15 - 20 Lacs

Hyderabad

Hybrid

Source: Naukri

About Us: Our global community of colleagues brings a diverse range of experiences and perspectives to our work. You'll find us working from a corporate office or plugging in from a home desk, listening to our customers and collaborating on solutions. Our products and solutions are vital to businesses of every size, scope and industry. And at the heart of our work you'll find our core values: to be data inspired, relentlessly curious and inherently generous. Our values are the constant touchstone of our community; they guide our behavior and anchor our decisions.

Designation: Software Engineer II
Location: Hyderabad

Key Responsibilities
- Design, build, and deploy new data pipelines within our Big Data ecosystems using StreamSets, Talend, Informatica BDM, etc.; document new and existing pipelines and datasets.
- Design ETL/ELT data pipelines using StreamSets, Informatica, or any other ETL processing engine.
- Familiarity with data pipelines, data lakes, and modern data warehousing practices (virtual data warehouse, push-down analytics, etc.).
- Expert-level programming skills in Python and Spark.
- Cloud-based infrastructure: GCP.
- Experience with ETL tools (Informatica, StreamSets) in the creation of complex parallel loads, cluster batch execution, and dependency creation using jobs/topologies/workflows, etc.
- Experience in SQL and conversion of SQL stored procedures into Informatica/StreamSets.
- Strong exposure working with web-service origins/targets/processors/executors, XML/JSON sources, and RESTful APIs.
- Strong exposure working with relational databases (DB2, Oracle & SQL Server), including complex SQL constructs and DDL generation.
- Exposure to Apache Airflow for scheduling jobs (see the sketch after this listing).
- Strong knowledge of big data architecture (HDFS), cluster installation, configuration, monitoring, cluster security, cluster resource management, maintenance, and performance tuning.
- Create POCs to enable new workloads and technical capabilities on the platform; work with the platform and infrastructure engineers to implement these capabilities in production.
- Manage workloads and enable workload optimization, including managing resource allocation and scheduling across multiple tenants to fulfill SLAs.
- Participate in planning and data science activities, and work to increase platform skills.

Key Requirements
- Minimum 6 years of experience in ETL/ELT technologies, preferably StreamSets/Informatica/Talend, etc.
- Minimum 6 years of hands-on experience with big data technologies, e.g. Hadoop, Spark, Hive.
- Minimum 3 years of experience with Spark.
- Minimum 3 years of experience in cloud environments, preferably GCP.
- Minimum 2 years working in big data service delivery (or equivalent) roles covering: NoSQL and graph databases; Informatica or StreamSets data integration (ETL/ELT); exposure to role- and attribute-based access controls.
- Hands-on experience managing solutions deployed in the cloud, preferably on GCP.
- Experience working in a global company; working in a DevOps model is a plus.

Dun & Bradstreet is an Equal Opportunity Employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, creed, sex, age, national origin, citizenship status, disability status, sexual orientation, gender identity or expression, pregnancy, genetic information, protected military and veteran status, ancestry, marital status, medical condition (cancer and genetic characteristics) or any other characteristic protected by law. We are committed to Equal Employment Opportunity and to providing reasonable accommodations to qualified candidates and employees. If you are interested in applying for employment with Dun & Bradstreet and need special assistance or an accommodation to use our website or to apply for a position, please send an e-mail with your request to acquisitiont@dnb.com. Determinations on requests for reasonable accommodation are made on a case-by-case basis.
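The listing above asks for exposure to Apache Airflow for scheduling jobs alongside Python and Spark. As a rough, hedged illustration (not part of the posting, and not Dun & Bradstreet's code), here is a minimal Airflow 2.x DAG of that kind; the DAG id, script paths, and daily cadence are hypothetical placeholders.

```python
# Minimal Airflow 2.x DAG sketch: a daily ELT job that stages source data,
# then runs a Spark transform. All names and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="orders_elt",                 # hypothetical pipeline name
    schedule="@daily",                   # Airflow >= 2.4 parameter spelling
    start_date=datetime(2024, 1, 1),
    catchup=False,                       # do not backfill missed runs
) as dag:
    ingest = BashOperator(
        task_id="ingest",
        bash_command="python /opt/pipelines/ingest_orders.py",
    )
    transform = BashOperator(
        task_id="transform",
        bash_command="spark-submit /opt/pipelines/transform_orders.py",
    )
    ingest >> transform                  # transform waits for ingestion
```

The `ingest >> transform` line declares the task dependency, which is the kind of "dependency creation" across jobs/workflows the listing mentions.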

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Source: Foundit

We are seeking an experienced and highly skilled IBM StreamSets Developer to design, develop, and optimize high-performance data pipelines. The ideal candidate will be an independent contributor with deep expertise in StreamSets Data Collector (SDC) and Transformer, capable of analyzing pipeline performance bottlenecks and implementing optimizations for scalability, reliability, and efficiency.

Key Responsibilities:
- Design, develop, and deploy robust StreamSets data pipelines for batch and real-time data ingestion, transformation, and delivery.
- Analyze and troubleshoot pipeline performance bottlenecks (CPU, memory, I/O, latency) and implement optimizations.
- Fine-tune JVM settings, parallelism, partitioning, and batch sizes for optimal throughput.
- Implement best practices for error handling, data validation, and recovery mechanisms in pipelines.
- Optimize slow-running stages, memory-heavy transformations, and network latency issues.
- Work with Kafka, JDBC, REST APIs, and other data sources/destinations.
- Monitor pipeline health using StreamSets Control Hub and set up alerts for failures.
- Collaborate with cross-functional teams, architects, and DevOps to ensure high-performance data flows.
- Document pipeline architecture, optimizations, and performance benchmarks.

Required Skills & Experience:
- 3+ years of hands-on experience with IBM StreamSets (Data Collector & Transformer).
- Strong understanding of pipeline performance tuning (e.g., stage optimization, buffer tuning, cluster resource allocation).
- Proficiency in Java/Python/Groovy for custom scripting in StreamSets (a hedged sketch follows this listing).
- Knowledge of SQL, NoSQL databases, and CDC (Change Data Capture) techniques.
- Ability to diagnose and resolve memory leaks, thread contention, and network bottlenecks.
- Familiarity with CI/CD for StreamSets pipelines (Git, Jenkins, Docker).
- Strong analytical skills to profile and benchmark pipeline performance.

Nice to Have:
- StreamSets certification (e.g., StreamSets Engineer).
- Experience with Kubernetes for containerized StreamSets deployments.
- Knowledge of data observability tools (Datadog).
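As a rough illustration of the custom scripting and per-record error handling this listing calls for (a sketch, not IBM's or the employer's code): StreamSets Data Collector's scripting stages, such as the Jython Evaluator, hand the script a batch as `records` along with `output` and `error` bindings. The field name and cleansing rule below are hypothetical.

```python
# Sketch of a StreamSets Data Collector Jython Evaluator script.
# SDC injects 'records', 'output', and 'error' into the stage's scope;
# the 'amount' field and the normalisation rule are hypothetical.
for record in records:
    try:
        # Coerce a numeric field; bad rows go to the error sink instead
        # of failing the whole batch (per-record error handling).
        record.value['amount'] = float(record.value['amount'])
        output.write(record)             # pass the record to the next stage
    except Exception as e:
        error.write(record, str(e))      # route to the stage's error handling
```

Routing bad rows record by record to the error sink, instead of aborting the batch, is one common form of the recovery mechanism the responsibilities above describe.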

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Kolkata

Work from Office

Source: Naukri

Role Purpose: The purpose of this role is to design, develop and troubleshoot solutions/designs/models/simulations in various software packages as per the client's/project's requirements.

Do

1. Design and develop solutions as per the client's specifications
- Work on software such as CAD and CAE to develop appropriate models as per the project plan/customer requirements.
- Test the prototypes and designs produced in the software and check all the boundary conditions (impact analysis, stress analysis, etc.).
- Produce specifications and determine operational feasibility by integrating software components into a fully functional software system.
- Create a prototype as per the engineering drawings once the outline CAD model is prepared.
- Perform failure mode and effects analysis (FMEA) for any new requirements received from the client.
- Provide optimized solutions to the client by running simulations in a virtual environment.
- Ensure software is updated with the latest features to make it cost-effective for the client.
- Enhance applications/solutions by identifying opportunities for improvement, making recommendations, and designing and implementing systems.
- Follow industry-standard operating procedures for various processes and systems as per the client requirement while modeling a solution in the software.

2. Provide customer support and problem solving from time to time
- Fix defects raised by the client or the software integration team while resolving tickets.
- Develop software verification plans and quality assurance procedures for the customer.
- Troubleshoot, debug and upgrade existing systems on time, with minimum latency and maximum efficiency.
- Deploy programs and evaluate user feedback to ensure resolutions achieve customer satisfaction.
- Comply with project plans and industry standards.

3. Ensure reporting & documentation for the client
- Prepare weekly and monthly status reports for the clients as per requirements.
- Maintain documents and create a repository of all design changes, recommendations, etc.
- Maintain time-sheets for the clients.
- Provide written knowledge transfer/history of the project.

Deliver
1. Design and develop solutions. Measure: adherence to project plan/schedule; 100% error-free onboarding and implementation; throughput %.
2. Quality & CSAT. Measure: on-time delivery; minimum corrections; first-time-right; no major defects post production; 100% compliance with the bi-directional traceability matrix; completion of assigned certifications for skill upgradation.
3. MIS & reporting. Measure: 100% on-time MIS and report generation.

Mandatory Skills: StreamSets. Experience: 5-8 years.

Posted 3 weeks ago

Apply

7 - 11 years

50 - 60 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Source: Naukri

Role: Resident Solution Architect
Location: Remote

The Solution Architect at Koantek builds secure, highly scalable big data solutions to achieve tangible, data-driven outcomes, all the while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of the Databricks Lakehouse Platform into the enterprise ecosystem and AWS/Azure/GCP architecture. This role is responsible for implementing securely architected big data solutions that are operationally reliable, performant, and deliver on strategic initiatives.

Specific requirements for the role include:
- Expert-level knowledge of data frameworks, data lakes and open-source projects such as Apache Spark, MLflow, and Delta Lake.
- Expert-level hands-on coding experience in Python, SQL, Spark/Scala, or PySpark.
- In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib.
- IoT/event-driven/microservices in the cloud: experience with private and public cloud architectures, their pros/cons, and migration considerations.
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services.
- Extensive hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Experience using Azure DevOps and CI/CD as well as Agile tools and processes, including Git, Jenkins, Jira, and Confluence.
- Experience creating tables, partitioning, bucketing, loading and aggregating data using Spark SQL/Scala (see the sketch after this listing).
- Able to build ingestion to ADLS and enable a BI layer for analytics, with a strong understanding of data modeling and defining conceptual, logical, and physical data models.
- Proficient-level experience with architecture design, build, and optimization of big data collection, ingestion, storage, processing, and visualization.

Responsibilities:
- Work closely with team members to lead and drive enterprise solutions, advising on key decision points, trade-offs, best practices, and risk mitigation.
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications.
- Promote, emphasize, and leverage big data solutions to deploy performant systems that appropriately auto-scale, are highly available, fault-tolerant, self-monitoring, and serviceable.
- Use a defense-in-depth approach in designing data solutions and AWS/Azure/GCP infrastructure.
- Assist and advise data engineers in the preparation and delivery of raw data for prescriptive and predictive modeling.
- Aid developers in identifying, designing, and implementing process improvements with automation tools to optimize data delivery.
- Implement processes and systems to monitor data quality and security, ensuring production data is accurate and available for key stakeholders and the business processes that depend on it.
- Employ change management best practices to ensure that data remains readily accessible to the business.
- Implement reusable design templates and solutions to integrate, automate, and orchestrate cloud operational needs; experience with MDM using data governance solutions.

Qualifications:
- Overall experience of 12+ years in the IT field.
- Hands-on experience designing and implementing multi-tenant solutions using Azure Databricks for data governance, data pipelines for near-real-time data warehousing, and machine learning solutions.
- Design and development experience with scalable and cost-effective Microsoft Azure/AWS/GCP data architecture and related solutions.
- Experience in software development, data engineering, or data analytics using Python, Scala, Spark, Java, or equivalent technologies.
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience.

Good to have advanced technical certifications:
- Azure Solutions Architect Expert
- AWS Certified Data Analytics; DASCA Big Data Engineering and Analytics
- AWS Certified Cloud Practitioner, Solutions Architect
- Professional Google Cloud Certified

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
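For the Spark SQL table-creation and partitioning line item above, here is a minimal PySpark sketch (illustrative only, not Koantek's code); it assumes a Spark session with Delta Lake support, as on Databricks, and the source path, `ts` column, and `analytics.events` table name are hypothetical.

```python
# Minimal PySpark sketch of creating a partitioned Delta table and
# aggregating it with Spark SQL. Assumes Delta Lake support (e.g. Databricks);
# the source path, 'ts' column, and 'analytics.events' table are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_demo").getOrCreate()

raw = spark.read.json("/mnt/raw/events/")          # read raw JSON off the lake

(raw.withColumn("event_date", F.to_date("ts"))
    .write.format("delta")
    .partitionBy("event_date")                     # partition for pruning
    .mode("overwrite")
    .saveAsTable("analytics.events"))

# Aggregate over the managed table with plain Spark SQL.
spark.sql("""
    SELECT event_date, COUNT(*) AS n_events
    FROM analytics.events
    GROUP BY event_date
""").show()
```

Partitioning by `event_date` lets Spark prune partitions at query time, which is the usual motivation for the partitioning and bucketing experience the listing asks for.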

Posted 1 month ago

Apply

8 - 13 years

9 - 19 Lacs

Noida

Work from Office

Source: Naukri

Job Description: Senior Data Engineer

Position Overview: We are looking for a talented and experienced Senior Data Engineer to join our growing data team. In this role, you will work closely with data scientists, analysts, and other engineering teams to design, develop, and maintain robust data infrastructure and pipelines. Your work will ensure the availability of high-quality, reliable data to support business decision-making, reporting, and machine learning applications.

Requirement: An ETL Developer to design, develop, and maintain data integration solutions using the Informatica tool, focusing on Extract, Transform, and Load (ETL) processes to ensure data quality and optimize performance (a generic sketch of the E-T-L flow follows this listing).
- Designing and developing ETL processes: creating and implementing ETL workflows to extract data from various sources, transform it, and load it into target systems.
- Data integration: ensuring data from different sources is integrated seamlessly and accurately.
- Data warehousing: designing and maintaining data warehouses to support business intelligence and analytics.
- Performance tuning: optimizing ETL processes for speed and efficiency.
- Troubleshooting and debugging: identifying and resolving issues within ETL processes.
- Collaboration: working with business users, data analysts, and other developers to understand requirements and implement solutions.
- Documentation: creating and maintaining technical documentation for ETL processes.
- Testing: performing unit, integration, and system testing on ETL processes, ensuring data accuracy, consistency, and completeness.
- Willing to learn new cloud technologies as required; an individual contributor to the team who can work to tight deadlines.
- Able to communicate with stakeholders and the business to gather requirements.
- Able to work with multiple platform teams to arrange the infrastructure to set up and troubleshoot multiple environments.
- Able to handle end-to-end project work, from analysis to production deployment and support.

Please rush resumes to suprotim@baanyan.com. For questions, please call +91-9038901659.
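The bullets above describe the extract-transform-load flow in the abstract. The posting's actual tool, Informatica, is a visual ETL product, so the plain-Python sketch below is only a hedged illustration of that flow; the source file, columns, and target table are hypothetical.

```python
# Plain-Python illustration of an extract-transform-load flow (hypothetical
# file, columns, and target; the role's real tooling is Informatica).
import csv
import sqlite3

# Extract: read rows from a source file.
with open("orders.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: basic cleansing and type conversion.
cleaned = [
    (r["order_id"], r["customer"].strip().title(), float(r["amount"]))
    for r in rows
    if r["amount"]                       # drop rows with a missing amount
]

# Load: write the cleansed rows into a target table.
con = sqlite3.connect("warehouse.db")
con.execute(
    "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)"
)
con.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)
con.commit()
con.close()
```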

Posted 2 months ago

Apply

5 - 8 years

27 - 32 Lacs

Bengaluru

Work from Office

Source: Naukri

Key responsibilities
- Development: hands-on, sleeves-up development and delivery expected as a matter of course.
- Delivery: ensure project goals are achieved on time in alignment with stakeholders' expectations. Ability to work on complex projects and in a distributed environment. Escalate when necessary and in a timely manner. Work in close collaboration with other team members in the Enterprise Data & Analytics Platform team to ensure development/delivery aspects are well represented in the project's requirements and deliverables.
- Methodology: incorporate agile ways of working into the delivery process using DABL (Discovery, Alpha, Beta, Launch). Individuals will work as part of product-centric delivery teams that focus on delivering value independently while fully embracing integrated DevOps approaches.
- Ownership: take ownership of delivery/development projects and help steer them to completion.
- Governance: maintain governance that allows projects and stakeholders to manage overall project performance and programme risks within the global nature of some of the programmes.
- Forward looking: remain flexible towards technology approaches to ensure we are taking advantage of new technologies. Keep abreast of industry developments in analytics and be able to interpret how these would impact services and present new opportunities.
- Quality, risk & compliance: ensure all risks and issues associated with owned projects are recorded and managed in the appropriate risk & issue logs in a timely manner, with clear action/mitigation/contingency plans defined, named action owners, and timelines for completion.
- Technical architecture: be conversant with the technical architecture so as to contribute to design discussions in partnership with the Delivery/Development Lead and the dedicated Analytics & Data Architect.

Qualifications and skills (essential)
- MS/BS degree in Computer Science, Engineering, Data Science or equivalent experience, with preference given to experience and a proven track record. The ideal candidate would have an impressive hands-on work history in an advanced, recognized, and innovative environment.
- 5 to 8 years of data engineering experience; a seasoned coder in the relevant languages: Python, SQL, Scala, etc.
- Experience with the Azure data and analytics stack: Databricks, Data Factory, SQL DW, Cosmos DB, Power BI, Power Apps, etc.
- Experience integrating and supporting a variety of enterprise data tools: Ataccama, Talend, Collibra, Snowflake, StreamSets, etc.
- Fully conversant with big-data processing approaches and "schema-on-read" methodologies (illustrated in the sketch after this listing).
- Preference for a deep understanding of Spark, Databricks and Delta Lake, and for applying them to solve data science and machine learning business problems.
- Experience with Agile delivery frameworks and tools: SAFe, Jira, Confluence, Azure DevOps, etc.
- Experience with visualization tools and their application: developing reports, dashboards and KPI scorecards.
- Familiar with deploying enterprise analytics solutions at scale, including the applicable services: administration, qualification, and user-access provisioning.
- Experience articulating the business value of analytics projects and progressing solutions from MVP to scaled-up production solutions.
- Ability to work in close partnership with groups across the IT organization (security, compliance, infrastructure, etc.) and with business stakeholders in the commercial organizations.
- Ability to develop and maintain productive working relationships with suppliers and specialist technology providers to assemble and maintain a distinctive and flexible mix of capabilities against future requirements.
- Great communication skills and the ability to communicate inherently complicated technical concepts to non-technical stakeholders of all levels.
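As a hedged illustration of the "schema-on-read" methodology named above (a sketch, not this employer's code): raw files stay schemaless on the lake, and a schema is applied only when the data is read. Paths, fields, and types are hypothetical.

```python
# Hedged sketch of schema-on-read with PySpark: raw JSON stays schemaless on
# the lake, and a schema is applied only at read/query time. Paths, fields,
# and types are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("schema_on_read").getOrCreate()

# The schema lives with the reader, not with the storage layer.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("metric", DoubleType()),
])

df = spark.read.schema(schema).json("/mnt/raw/metrics/")   # applied on read
df.groupBy("user_id").sum("metric").show()
```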

Posted 2 months ago

Apply

7 - 10 years

10 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

Skills: StreamSets with Unix scripting; StreamSets administration. Experience with installing, patching, upgrading, administering, scripting, and maintaining/troubleshooting StreamSets. Notice period: 0-30 days.

Posted 2 months ago

Apply