Hi All,

We have immediate openings for the below requirement.

Role: Hadoop Administration
Skill: Hadoop Administrator (with EMR, Spark, Kafka, HBase, OpenSearch, Snowflake, Neo4j, AWS)
Experience: 4 to 9 years
Work location: Hyderabad
Interview mode: 1st round virtual & 2nd round F2F
Notice period: 15 days to immediate joiners only
Interested candidates can share your CV to:
Mail: sravani.vommi@sonata-software.com
Contact: 7075751998

JD for Hadoop Admin: Hadoop Administrator (with EMR, Spark, Kafka, HBase, OpenSearch, Snowflake, Neo4j, AWS)

Job Summary:
We are seeking a highly skilled Hadoop Administrator with hands-on experience managing distributed data platforms such as Hadoop EMR, Spark, Kafka, HBase, OpenSearch, Snowflake, and Neo4j.

Key Responsibilities:

Cluster Management:
- Administer, manage, and maintain Hadoop EMR clusters, ensuring optimal performance, high availability, and resource utilization.
- Handle the provisioning, configuration, and scaling of Hadoop clusters, with a focus on EMR, ensuring seamless integration with other ecosystem tools (e.g., Spark, Kafka, HBase).
- Oversee HBase configurations, performance tuning, and integration within the Hadoop ecosystem.
- Manage OpenSearch (formerly known as Elasticsearch) for log analytics and large-scale search applications.

Data Integration & Processing:
- Oversee the performance and optimization of Apache Spark workloads across distributed data environments.
- Design and manage efficient data pipelines between Snowflake, Kafka, and the Hadoop ecosystem, ensuring seamless data movement and transformation.
- Implement data storage solutions in Snowflake and manage seamless data transfers to/from Hadoop (EMR) and other environments.

Cloud & AWS Services:
- Work closely with AWS services such as EC2, S3, ECS, Lambda, IAM, RDS, and CloudWatch to build scalable, cost-efficient solutions for data management and processing.
- Manage AWS EMR clusters, ensuring they are optimized for big data workloads and integrated with other AWS services.

Security & Compliance:
- Manage and configure Kerberos authentication and access control mechanisms within the Hadoop ecosystem (HDFS, YARN, Spark) to ensure data security.
- Implement encryption and secure data transfer policies within Hadoop clusters, Kafka, HBase, and OpenSearch to meet compliance and regulatory requirements.
- Manage user roles and permissions for access to Snowflake and ensure seamless integration of security policies across platforms.

Monitoring & Troubleshooting:
- Set up and manage monitoring solutions to ensure the health of the Hadoop ecosystem and related components.
- Actively monitor and troubleshoot issues with Spark, Kafka, HBase, OpenSearch, and other distributed systems.
- Provide proactive support to address performance issues, bottlenecks, and failures.

Automation & Optimization:
- Automate the deployment, scaling, and management of Hadoop and other big data systems using scripting languages (Bash, Python); a sketch of this kind of automation follows below.
- Optimize the configurations and performance of EMR, Spark, Kafka, HBase, and OpenSearch.
- Develop scripts and utilities for backup, job monitoring, and performance tuning.
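To make the automation duty above concrete, here is a minimal Python sketch of an EMR health check, assuming the boto3 AWS SDK is installed and AWS credentials are configured; the region, cluster ID, and alerting stub are illustrative placeholders, not part of the role description.

# Minimal health check for an EMR cluster using boto3.
import boto3

emr = boto3.client("emr", region_name="us-east-1")  # region is illustrative

def check_cluster(cluster_id):
    # Look up the cluster's lifecycle state (RUNNING/WAITING count as healthy).
    desc = emr.describe_cluster(ClusterId=cluster_id)
    state = desc["Cluster"]["Status"]["State"]
    print(f"{cluster_id}: {state}")
    if state not in ("RUNNING", "WAITING"):
        # Replace with real alerting (SNS, PagerDuty, ...) in practice.
        print("Cluster unhealthy -- investigate or trigger alerting.")

check_cluster("j-XXXXXXXXXXXXX")  # placeholder cluster ID

The same pattern can be run on a schedule (cron, Lambda) and extended with CloudWatch metrics for proactive monitoring.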
Hi All,

We have immediate openings for SAP Commerce Cloud (Hybris) at Sonata Software.

Experience: 5 to 8 years
Work location: Bangalore, Hyderabad
Interview mode: Virtual
Notice period: 15 days to immediate
Note: SAP Commerce Cloud certification is a plus.

Key Responsibilities:
- Design and develop robust SAP Commerce Cloud (Hybris) solutions.
- Implement customizations using Java, Spring Framework, and Spring Web Services.
- Develop and maintain Spartacus-based storefronts and ensure seamless integration with backend systems.
- Participate in code reviews, debugging, and optimization.
- Collaborate with UI/UX, QA, and DevOps teams to ensure timely delivery and high-quality output.
- Work closely with business stakeholders to gather and analyze requirements.
- Provide support and maintenance of existing e-commerce applications.

Required Skills & Experience:
- 4+ years of experience in SAP Commerce Cloud (Hybris) development.
- Strong proficiency in Java, Spring, and Spring Web Services.
- Hands-on experience with Spartacus and frontend frameworks like Angular.
- Good understanding of RESTful and SOAP web services (a sketch follows below).
- Experience in building scalable and high-performance e-commerce solutions.
- Strong knowledge of B2B/B2C commerce flows and architecture.
- Familiarity with CI/CD tools and Agile methodologies.

Interested candidates can share your CV to sravani.vommi@sonata-software.com
Contact: 7075751998
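For orientation, a minimal Python sketch of the kind of REST integration a Spartacus storefront relies on: fetching a product through the SAP Commerce OCC v2 REST API. The host, base site ID, and product code are illustrative placeholders; assumes the requests package.

# Fetch a product via the OCC v2 REST API (the backend API Spartacus consumes).
import requests

BASE_URL = "https://api.example-commerce.com/occ/v2/electronics"  # placeholder host/site

def get_product(code):
    # fields=FULL asks OCC for the full product representation.
    resp = requests.get(f"{BASE_URL}/products/{code}",
                        params={"fields": "FULL"}, timeout=10)
    resp.raise_for_status()
    return resp.json()

product = get_product("123456")  # placeholder product code
print(product.get("name"), product.get("price", {}).get("formattedValue"))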
Hi All,

Greetings from Sonata Software!

We have immediate openings for the below requirement.

Role: Database Developer
Skill set: PostgreSQL, PL/SQL
Work location: Hyderabad
Experience: 5 to 7 years
Notice period: Only immediate joiners preferred
Interview mode: 1st round virtual & 2nd round F2F (mandatory)

Interested candidates:
Contact: 7075751998
Mail: sravani.vommi@sonata-software.com

Thanks & Regards,
Sravani
IT Recruiter
Role & Responsibilities

Title: Talend Senior Developer
Experience: 7 to 9 years
Location: Hyderabad (work from office, 5 days a week)
Type of hire: Full time

Overview
A Talend Senior Developer is responsible for designing, implementing, and maintaining complex ETL (Extract, Transform, Load) processes and data integration solutions using Talend. This role involves advanced technical skills, leadership, and collaboration across functional and technical teams to deliver high-quality data solutions that support business needs.

Key Responsibilities

ETL Process Development:
- Design, develop, and optimize ETL processes using Talend Studio and related tools to meet business requirements.
- Maintain, enhance, and troubleshoot existing Talend ETL workflows for data integrity and performance.
- Build and deploy Talend pipelines for integrating data from various sources, including databases, files, APIs, and cloud platforms.

Solution Architecture & Analysis:
- Analyze business and technical requirements to design scalable data integration architectures.
- Lead design review sessions and translate business requirements into technical system flows, data flows, and mappings.

Performance & Quality:
- Implement robust error handling, validation, and data cleansing logic within Talend jobs.
- Optimize ETL processes for scalability, reliability, and performance, including parallelization and tuning of components.

Collaboration & Leadership:
- Serve as a technical subject matter expert and mentor to junior developers.
- Collaborate with cross-functional teams (business analysts, data architects, DevOps, DBAs, etc.) to deliver data solutions.
- Drive and support multiple projects simultaneously; contribute to project planning, estimation, and best-practices adoption.

Documentation & Compliance:
- Document ETL processes, technical specifications, and architecture diagrams for ongoing support and governance.
- Ensure solutions comply with organizational standards for security, quality, and data governance.

System Operations & Support:
- Monitor, support, and troubleshoot production ETL jobs and data pipelines.
- Schedule, deploy, and manage Talend jobs through management consoles (TAC, TMC).
- Handle ongoing maintenance, upgrades, and performance analysis of the Talend environment.

Required Skills & Qualifications
- Talend Expertise: Proficiency with Talend Studio (including Open Studio, Cloud, Big Data, MDM, DI, DQ); advanced component knowledge (tMap, tJoin, orchestration modules).
- Snowflake SQL: Strong experience with Snowflake SQL for writing, optimizing, and troubleshooting queries within Snowflake environments (see the sketch after this section).
- Database Skills: Strong SQL proficiency; experience with major RDBMS (e.g., Oracle, PostgreSQL, MySQL) and big data systems.
- Programming: Experience with Unix/Linux shell scripting and basic Java or JavaScript for Talend customization.
- Performance Tuning: Ability to optimize and scale ETL solutions; troubleshooting and root cause analysis.
- ETL Fundamentals: Deep understanding of data integration, warehousing, data modeling, and system architecture.
- Leadership: Ability to mentor, lead design reviews, and set best practices for the ETL team.
- DevOps & CI/CD: Familiarity with source control (Git, SVN), CI/CD frameworks, and DevOps concepts for ETL deployments.
- Communication: Excellent written and verbal communication; strong collaboration skills.

Preferred Candidate Profile
Education: Bachelor's degree in Computer Science, Engineering, or a related field (preferred), plus 5+ years of professional ETL/Talend experience.
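As referenced in the Snowflake SQL requirement above, a minimal Python sketch of query troubleshooting in Snowflake, assuming the snowflake-connector-python package; the account, credentials, and schema names are placeholders.

# Find the slowest recent queries in Snowflake for troubleshooting.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",         # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # QUERY_HISTORY() is a Snowflake information-schema table function;
    # TOTAL_ELAPSED_TIME is reported in milliseconds.
    cur.execute("""
        SELECT query_id, total_elapsed_time, query_text
        FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
        WHERE total_elapsed_time > 60000
        ORDER BY total_elapsed_time DESC
        LIMIT 10
    """)
    for query_id, elapsed_ms, text in cur.fetchall():
        print(query_id, elapsed_ms, text[:80])
finally:
    cur.close()
    conn.close()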
Key Attributes
- Strong technical analysis and problem-solving skills.
- Attention to detail, self-motivation, and ability to multitask.
- Adaptable to changing project requirements and able to work independently or as part of a team.
- Commitment to continuous learning and staying updated with data integration and Talend best practices.

Typical Job Duties Summary
- Lead analysis, design, implementation, and maintenance of large-scale data integration projects.
- Optimize, document, and support Talend-based ETL processes.
- Mentor junior team members and provide technical direction.
- Champion performance, data quality, and system reliability.

Additions to Job Responsibilities
- Integrate Talend data pipelines with Snowflake, leveraging Snowflake SQL for ELT/ETL transformations, data quality checks, and performance optimization (a data quality sketch follows below).
- Design and optimize complex queries, procedures, and data flows within Snowflake to support scalable analytics and reporting.
- Troubleshoot, tune, and monitor Snowflake-based jobs and queries for efficiency and accuracy.
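A minimal sketch of the data quality check mentioned above: reconciling row counts between a Talend-loaded staging table and its Snowflake target. Table names and connection parameters are illustrative; assumes snowflake-connector-python.

# Post-load row-count reconciliation between staging and target tables.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="ANALYTICS",
)

def row_count(table):
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()

staged = row_count("STAGING.ORDERS_STG")  # illustrative table names
loaded = row_count("CORE.ORDERS")
if staged != loaded:
    # In a pipeline run, this would fail the job or raise an alert.
    raise RuntimeError(f"Row count mismatch: staged={staged}, loaded={loaded}")
print(f"Data quality check passed: {loaded} rows reconciled.")
conn.close()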