5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Job description
Primary skillsets:
- 5 years of hands-on experience in Informatica PowerCenter ETL development
- 7 years of experience in SQL, analytical STAR schema data modeling, and Informatica PowerCenter
- 5 years of Redshift, Oracle, or comparable database experience with BI/DW deployments
Secondary skillsets:
- Good to know: cloud services such as AWS
- Must have proven experience with STAR and SNOWFLAKE schema techniques (illustrated in the sketch below)
- Proven track record as an ETL developer, with the potential to grow into an Architect leading development teams to deliver successful business intelligence solutions with complex data sources
- Strong analytical skills and enjoyment of solving complex technical problems
- Knowledge of additional ETL tools such as Qlik Replicate
- End-to-end understanding of data from ingestion to transformation to consumption in analytics would be a great benefit
Keywords: Informatica PowerCenter, Informatica (ETL Developer), Star Schema, Redshift
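For readers unfamiliar with the term, a minimal, self-contained sketch of the STAR schema pattern this posting references: one fact table joined to dimension tables on surrogate keys. All table and column names here are hypothetical, chosen for illustration; a real BI/DW deployment of this kind would live in Redshift or Oracle rather than SQLite.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes.
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT)")
# The fact table holds measures plus foreign keys to each dimension.
cur.execute("""CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    revenue REAL)""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)", [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "Widget"), (2, "Gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20240101, 1, 100.0), (20240101, 2, 50.0), (20240201, 1, 75.0)])

# The characteristic STAR query: aggregate fact measures grouped by dimension attributes.
for row in cur.execute("""
    SELECT d.year, d.month, p.product_name, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, d.month, p.product_name"""):
    print(row)
```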
Posted 1 month ago
4.0 - 9.0 years
25 - 30 Lacs
Bengaluru
Work from Office
The Lending team at Grab is dedicated to building safe and secure loan products catering to all user segments across SEA. Our mission is to promote financial inclusion and support underbanked partners across the region. Data plays a pivotal role in our lending operations, guiding decisions across credit assessment, collections, reporting, and beyond. You will report to the Lead Data Engineer. This role is based in Bangalore.

Get to Know the Role:
As a Data Engineer in the Lending Data Engineering team, you will work with data modellers, product analytics, product managers, software engineers, and business stakeholders across SEA to understand the business and data requirements. You will build and manage the data asset, including acquisition, storage, processing, and use channels, using some of the most scalable and resilient open-source big data technologies like Flink, Airflow, Spark, Kafka, Trino, and more on cloud infrastructure. You are encouraged to think out of the box and have fun exploring the latest patterns and designs.

The Critical Tasks You Will Perform:
- Develop scalable, reliable ETL pipelines to ingest data from diverse sources (see the sketch after this posting).
- Build expertise in real-time data availability to support accurate real-time metric definitions.
- Implement data quality checks and governance best practices for data cleansing, assurance, and ETL operations.
- Use existing data platform tools to set up and manage pipelines.
- Improve data infrastructure performance to ensure reliable insights for decision-making.
- Design next-gen data lifecycle management tools/frameworks for batch, real-time, API-based, and serverless use cases.
- Build solutions using AWS services like Glue, Redshift, Athena, Lambda, S3, Step Functions, EMR, and Kinesis.
- Use tools like Amazon MSK/Kinesis for real-time data processing and metric tracking.

Essential Skills You'll Need:
- 4+ years of experience building scalable, secure, distributed data pipelines.
- Proficiency in Python, Scala, or Java for data engineering solutions.
- Knowledge of big data technologies like Flink, Spark, Trino, Airflow, Kafka, and AWS services (EMR, Glue, Redshift, Kinesis, and Athena).
- Solid experience with SQL, data modelling, and schema design.
- Hands-on experience with AWS storage and compute services (S3, DynamoDB, Athena, and Redshift Spectrum).
- Experience working with NoSQL, columnar, and relational databases.
- Curiosity and eagerness to explore new data technologies and solutions.
- Familiarity with in-house and AWS-native tools for efficient pipeline development.
- Ability to design event-driven architectures using SNS, SQS, Lambda, or similar serverless technologies.
- Experience with data structures, algorithms, or ML concepts.

About Grab and Our Workplace
Grab is Southeast Asia's leading superapp. From getting your favourite meals delivered to helping you manage your finances and getting around town hassle-free, we've got your back with everything. At Grab, purpose gives us joy and habits build excellence, while we harness the power of technology and AI to deliver our mission of driving Southeast Asia forward by economically empowering everyone, with heart, hunger, honour, and humility.

Life at Grab
We care about your well-being at Grab; here are some of the global benefits we offer:
- We have your back with Term Life Insurance and comprehensive Medical Insurance.
- With GrabFlex, create a benefits package that suits your needs and aspirations.
- Celebrate moments that matter in life with loved ones through Parental and Birthday leave, and give back to your communities through Love-all-Serve-all (LASA) volunteering leave.
- We have a confidential Grabber Assistance Programme to guide and uplift you and your loved ones through life's challenges.

What We Stand For at Grab
We are committed to building an inclusive and equitable workplace that enables diverse Grabbers to grow and perform at their best. As an equal opportunity employer, we consider all candidates fairly and equally regardless of nationality, ethnicity, religion, age, gender identity, sexual orientation, family commitments, physical and mental impairments or disabilities, and other attributes that make them unique.
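As a rough illustration of the batch ETL pipelines this role describes, here is a minimal sketch using Apache Airflow, one of the tools the posting names, assuming Airflow 2.x. The DAG id, task names, and extract/transform/load bodies are hypothetical placeholders, not Grab's actual pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw records from a source system (placeholder).
    print("extracting records")


def transform():
    # Apply cleansing and data-quality checks (placeholder).
    print("transforming and validating records")


def load():
    # Write curated output to the data lake / warehouse (placeholder).
    print("loading records")


with DAG(
    dag_id="lending_ingestion_example",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Enforce extract -> transform -> load ordering.
    (
        PythonOperator(task_id="extract", python_callable=extract)
        >> PythonOperator(task_id="transform", python_callable=transform)
        >> PythonOperator(task_id="load", python_callable=load)
    )
```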
Posted 1 month ago
10.0 - 14.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Netradyne harnesses the power of Computer Vision and Edge Computing to revolutionize the modern-day transportation ecosystem. We are a leader in fleet safety solutions. With growth exceeding 4x year over year, our solution is quickly being recognized as a significant disruptive technology. Our team is growing, and we need forward-thinking, uncompromising, competitive team members to continue to facilitate our growth.

Job Title: Senior Staff Manager
Location: Whitefield, Bangalore
Department/Group: Cloud Group
Experience: 10-14 Years

Role Overview
We are looking for a Senior Staff Manager to lead the development of cutting-edge SaaS products that tackle planet-scale challenges. This role involves working with real-time data from billions of miles and building systems that process hundreds of millions of IoT sensory inputs daily.

Key Responsibilities
- Design, develop, and own SaaS products that solve large-scale, real-time data problems.
- Build highly scalable systems capable of processing massive volumes of IoT data.
- Translate product requirements into actionable technical tasks.
- Lead projects through the full software development lifecycle (SDLC).
- Develop efficient, cloud-hosted services and robust data pipelines.

Essential Skills
- Strong programming expertise in Java and Spring Boot.
- Solid foundation in computer science fundamentals: algorithms, data structures, OOP, and SQL.
- Good understanding of database internals and schema design for RDBMS and NoSQL systems.
- Proven experience in designing low-latency APIs for large datasets.
- Expertise in building scalable data processing pipelines.
- Hands-on experience with AWS and cloud-hosted environments.
- Strong analytical and debugging skills.
- Demonstrated ability to mentor and lead teams to deliver high-quality software.
- Familiarity with Agile methodologies and product development best practices.

Qualifications
B.Tech/M.Tech in Computer Science or a related field, or equivalent professional experience.
Posted 1 month ago
5.0 - 10.0 years
5 - 10 Lacs
Hyderabad
Work from Office
About FactSet
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients' needs and exceeding their expectations.

Your Team's Impact
FactSet is seeking an experienced software development engineer with proven proficiency in deploying software that adheres to best practices, and with fluency in the development environment and related tools, code libraries, and systems. You will be responsible for the entire development process and will collaborate to create a theoretical design. We expect a demonstrated ability to critique code and production for improvement, to receive and apply feedback effectively, and to maintain expected levels of productivity while becoming increasingly independent as a software developer, requiring less direct engagement and oversight on a day-to-day basis from one's manager. The focus is on developing applications, testing and maintaining software, and the implementation details of development; on increasing the volume of work accomplished (with consistent quality, stability, and adherence to best practices); and on gaining mastery of the products to which one contributes and beginning to participate in forward design discussions based on one's observations of the code, systems, and production involved. Software Developers provide project leadership and technical guidance along every stage of the software development life cycle.

What You'll Do
- Work on the Data Lake/DAM platform handling millions of documents annually.
- Focus on developing new features while supporting and maintaining existing systems, ensuring the platform's continuous improvement.
- Participate in weekly on-call support to address urgent queries and issues in common communication channels, ensuring operational reliability and user satisfaction.
- Create comprehensive design documents for major architectural changes and facilitate peer reviews to ensure quality and alignment with best practices.
- Collaborate with product managers and key stakeholders to thoroughly understand requirements and propose strategic solutions, leveraging cross-functional insights.
- Actively participate in technical discussions with principal engineers and architects to support proposed design solutions, fostering a collaborative engineering environment.
- Work effectively as part of a geographically diverse team, coordinating with other departments and offices for seamless project progression.

What We're Looking For
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field is required.
- 5+ years of experience in software development, with a focus on database systems handling and operations:
  o Writing and optimizing complex SQL queries, stored procedures, views, triggers
  o Developing and maintaining database schemas and structures
  o Creating ETL pipelines for data ingestion and transformation
  o Troubleshooting data issues and performance bottlenecks
  o Mentoring junior developers
- Proven experience working with APIs, ensuring robust connectivity and integration across the system.
- Working experience with AWS services such as Lambda, EC2, S3, and AWS Glue is beneficial for cloud-based operations and deployments.
- Strong analytical and problem-solving skills, critical for developing innovative solutions and optimizing existing platform components.
- Excellent collaborative and communication skills, enabling effective interaction with geographically diverse teams and key stakeholders.
- Capability to address system queries and provide weekly on-call support, ensuring system reliability and user satisfaction.
- Ability to prioritize and manage work effectively in a fast-paced environment, demonstrating self-direction and resourcefulness.

Desired Skills:
- Deep RDBMS knowledge (e.g., SQL Server, Oracle, PostgreSQL)
- Strong T-SQL/PL-SQL scripting
- Query tuning and performance optimization
- Data modelling and DWH concepts
- Stored procedures, functions, views, triggers
- Query optimization techniques
- Execution plan analysis
- Indexing strategies
- Partitioning and table optimization
- Logical and physical data modelling
- Normalization/denormalization
- Often part of app development or analytics teams

What's In It for You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.

Salary is just one component of our compensation package and is based on several factors including but not limited to education, work experience, and certifications.

Company Overview
FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees' Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.

At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
About the Role:
Gartner is looking for passionate and motivated Lead Data Engineers who are excited to foray into new technologies and help build and maintain data-driven components for realizing business needs. This role is in the Gartner Product Delivery Organization (PDO). PDO Data Engineering teams are high-velocity agile teams responsible for developing and maintaining components crucial to customer-facing channels, data science teams, and reporting & analysis. These components include, but are not limited to, Spark jobs, REST APIs, AI/ML model training & inference, MLOps/DevOps pipelines, data transformation & quality pipelines, data lakes & data catalogs, data streams, etc.

What you will do:
- Lead and execute a mix of small/medium-sized projects simultaneously.
- Own success: take responsibility for successful delivery of solutions from development to production.
- Mentor and guide team members.
- Explore and create POCs of new technologies/frameworks.
- Work directly with business users in problem solving; significant prior experience doing so is expected.
- Communicate and prioritize effectively; interact and coordinate well with other developers/teams to resolve operational issues.
- Be self-motivated and a fast learner, ramping up quickly with a fair amount of help from team members.
- Estimate development tasks with high accuracy and deliver on time with high quality while following coding guidelines and best practices.
- Identify systemic operational issues and resolve them.

What you will need:
- 6+ years of post-college experience in data engineering, API development, or related fields.
- Demonstrated experience in data engineering, data science, or machine learning.
- Experience working with data platforms: building and maintaining ETL flows and data stores for ML and reporting applications.
- Skills to transform data, prepare it for analysis, and analyze it, including structured and unstructured data.
- Ability to transform business needs into technical solutions.
- Demonstrated experience with cloud platforms (AWS, Azure, GCP, etc.).
- Experience with languages such as Python, Java, SQL.
- Experience with tools such as Apache Spark, Databricks, AWS EMR.
- Experience with Kanban or Agile Scrum development.
- Experience with REST API development.
- Experience with collaboration tools such as Git, Jenkins, Jira, Confluence.
- Experience with data modeling and database schema/table design.
#LI-SP7
Posted 1 month ago
5.0 - 10.0 years
11 - 15 Lacs
Kochi
Work from Office
Strada is a technology-enabled, people-powered company committed to delivering world-class payroll, human capital management, and financial management solutions to organizations globally. It's why we're so driven to connect passion with purpose. Our team's experience in human insights and cloud technology gives companies and employees around the world the ability to power confident decisions, for life. With a comprehensive total rewards package, continuing education and training, and tremendous potential with a growing global organization, Strada is the perfect place to put your passion to work. To learn more about us, visit stradaglobal.com

Delivery
Key responsibilities:

Daily
- Collaborate with PSAs to track payroll progress according to the approved timeline.
- Evaluate priority issues with PSA agents on a daily basis and confirm proper follow-up.
- Handle client/TPV concerns and relationship building.

Monthly
- Create Service Review presentations (TPV countries) and conduct client calls.
- Check and authorize TPV invoices promptly, ensuring they comply with the SOW and that POs are correct and valid.
- Generate billing data for the client and ensure it is properly invoiced to the customer.
- Guarantee precise and timely completion of the client footprint and margin file.
- Hold governance calls with partners.
- Support PSAs during post-payroll calls and corrections (for the countries with challenges).
- Cross-check that BO entries are updated accurately; review LVMS miss comments.

Ad hoc
- Prepare the global hypercare log and lead hypercare calls.
- Conduct payroll walkthroughs for newly onboarded customers.
- Approve client-specific access management requests.
- Manage and drive remediation account processes.
- Check that process-related documents are updated with the latest information.
- Inspect and endorse BRD documents.
- Conduct hypercare calls based on payroll timelines, ensuring alignment with projects.
- CR management: move CRs in Salesforce for client approval; sign off CRs to TPVs after analysing cost and effort (negotiating price with the vendor if the cost doesn't seem reasonable).
- Review and approve RCAs before forwarding them to the client.
- Manage the security incident process by aligning with the SI team, partner (if applicable), CSL, and client.
- Assist in panel interviews.
- Take the lead on complex CRs (entity creation, descope, etc.).
- Coordinate with the commercial team, TPV, and client on descoping activities for accounts.
- Manage quarterly access reviews.
- Take the lead in client-specific testing of new system releases and CRs.
- Provide audit support based on customer requirements.
- Support contract renewals and modifications.
- Raise RR requests based on project requirements.
- Attend or facilitate internal meetings.
- Coordinate with ITSCO for technical support.

Annual
- Manage year-end activities.
- Review CPI indexation for vendors.

Requirements
- Minimum 5 years' experience with payroll and HR operations in client-facing situations.
- 3-year degree/diploma.
- Exceptional influencing skills, both internally and externally.
- Effective at managing both up and down the organizational chain, from communications through to execution.
- Change management experience.
- Experience working in a complex matrix structure.
- Understanding of delivery models from a global delivery centre perspective.
- Highly flexible approach to working hours/travel.
- Ownership of and responsibility for the activities of the role.
- Ability to identify and mitigate risks.
- Strong knowledge of MS Office tools such as Excel, Word, and PowerPoint.
- Flexibility to support a global and fast-paced environment.
- Attention to detail.
- Excellent written and verbal skills.
Posted 1 month ago
2.0 - 7.0 years
3 - 7 Lacs
Kochi
Work from Office
Annual Activity Planning
- Create the annual activity planner and share it with the client and TPV.
- Approve and publish the final version of the agreed annual payroll calendar and system set-up.
- Agree the password format for the year.

Service Delivery
- Act as the first point of escalation for payroll queries.
- Handle all non-payroll-related tickets under the correct function.
- Mass upload and master data processing in hrX (only if applicable).
- Exchange event monitoring (only for hrX clients).
- Manage RCAs: arrange RCAs, validate quality, etc.
- Use LVMS or BO reports to ensure all tickets are closed on time by the TPV.
- Responsible for updating, maintaining, and enforcing the Defined Work Instructions (DWIs) and CLIENT solution workbook.
- Responsible for the resolution of technical/functional issues escalated from the team, CLIENT, and/or partner, ensuring all system issues/defects are reported correctly and tickets are logged with the necessary details and evidence so Application Services and/or Products can investigate.

SLA Reporting
- Cross-check the KPIs against actual results and report to the TPV to identify and correct any deviation.
- Update SLAs and fail reasons in LVMS, reported on a monthly basis.

Change Requests
- Check the Client/Strada CSW/SOW for compliance.
- Check the Strada/TPV CSW/SOW for compliance.
- Notify the PSM of Change Requests raised.
- Apply the CR process as per the VPS 3.0 standard process.
- Update the CSW and get the client's approval on the changes in the documents.

Escalations
- SPOC for TPVs.
- First escalation point for clients.
- Include escalations in the RAG with PSM help.
- Manage TPV-related issues that need to be escalated.

Security and Compliance
- Initiate the SI process in case any SI is detected by the PSA.
- Perform SOC1 controls.

Hyper-care
- Participate in hyper-care calls.
- Collaborate with the Project Manager, PSM, and OA team for integration support, etc.
- Support and validate the tests performed during the pre-go-live phase (UAT/SIT testing and data mapping configuration; support in process definition).
- Hold a VPS process walkthrough call with all new CLIENTs during hyper-care.

Governance
- Manage regular operations calls (corrections call, post-payroll call, etc.).
- Prepare the post-payroll review deck.
- Manage the operational plan to track actions/issues.
- Manage TPV-related issues that need to be escalated.
- Ensure adherence to all agreed schedules as per the SOW for Client/TPV.
- Collaborate with PSMs to ensure the quality of the services the TPV provides to the client.

Requirements
- 2+ years of client/vendor management experience in similar industries.
- Experience in leading and handling client calls.
- 3-year degree/diploma.
Posted 1 month ago
8.0 - 10.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to execute the process and drive the performance of the team on the key metrics of the process.

Job Details
Country/Region: India
Employment Type: Hybrid
Work Type: Contract
State: Karnataka
City: Bengaluru

Job Summary:
We are seeking an experienced SAP MM Consultant with a strong background in SAP ECC 6.0 and expertise in Warehouse Management (WM). The ideal candidate will have in-depth experience in MM and WM configuration, at least one full lifecycle SAP implementation, and strong domain knowledge in subcontracting, consignment processes, MRP, and forecasting. Familiarity with Production Planning (PP) will be an added advantage.

Key Responsibilities:
- Configure and customize SAP MM and WM modules based on business requirements.
- Manage and support existing SAP MM/WM environments, ensuring stable operations.
- Work on subcontracting and consignment processes, ensuring accurate configuration and business process mapping.
- Configure and support the functionalities below within MM:
o Master data: Material Master, Vendor Master, Purchasing Info Records, Source List, Quota Arrangement
o Procurement process: Purchase Requisition (PR), Purchase Order (PO), Contracts, Scheduling Agreements
o Release procedures: Workflow and approval process for PRs and POs (both with and without classification)
o Inventory management: Goods Receipt, Goods Issue, Transfer Postings, Reservations, Consignment, Subcontracting
o Valuation and account determination: Integration with FI for automatic postings
o Pricing procedures: Condition records, access sequences, schema determination for purchasing
o Invoice verification: MIRO, MRBR, GR/IR clearing
o Batch management and serial number management (if applicable)
o Interfacing with barcode systems
- Map end-to-end warehouse business processes into SAP WM, including inbound, outbound, putaway, picking, packing, stock transfer, and physical inventory.
- Configure and support the following SAP WM components:
o Storage types, sections, and bins
o Putaway and picking strategies
o Transfer Orders (TO), Transfer Requirements (TR), and Posting Change Notices
o Inventory Management integration
o Batch Management and Serial Number tracking
o Replenishment and stock removal strategies
o Interfacing with barcode systems
- Strong understanding of SAP MRP logic, planning strategies (make-to-stock, make-to-order), lot-sizing procedures, and procurement types.
- Conduct MRP planning runs and analyze results for accuracy and optimization.
- Ability to configure MRP parameters, procurement types, and inventory controls.
- Collaborate with cross-functional teams, including PP, SD, and FI, for integrated solutions.
- Create functional specifications, test scripts, and training documentation.
- Troubleshoot and resolve production issues in MM/WM/PP areas.
- Provide end-user training and support as required.

Required Qualifications:
- 8-10 years of hands-on experience in SAP MM and WM modules within SAP ECC 6.0.
- Strong knowledge and experience in:
o MM and WM configuration
o Material Requirements Planning (MRP)
o Forecasting tools and configuration
o Process optimization
- Excellent understanding of business processes in procurement and warehouse management.
- In-depth knowledge of Master Data Management, Inventory Management, and Material Movements.
- Strong analytical and problem-solving skills.
- Effective communication and stakeholder management abilities.

Preferred Qualifications:
- Experience or exposure to SAP PP (Production Planning).
- Experience with integration points between MM, PP, SD, and FI.
Posted 1 month ago
2.0 - 7.0 years
10 - 15 Lacs
Bengaluru
Work from Office
- Develop, test, and support future-ready data solutions for customers across industry verticals.
- Develop, test, and support end-to-end batch and near real-time data flows/pipelines.
- Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies.
- Communicate risks and ensure they are understood.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Minimum of 2+ years of related experience.
- Experience in modeling and business system design.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Great expertise in writing T-SQL code.
- Well versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- A strong team player/leader, able to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting with all levels of the organization.
- Ability to clearly communicate complex business problems and technical solutions.
Posted 1 month ago
4.0 - 9.0 years
6 - 7 Lacs
Hyderabad
Work from Office
Hiring for Kafka Administrator -- Hyderabad

Job Summary:
We are seeking an experienced Kafka Administrator to manage and maintain our Apache Kafka ecosystem. This role is critical in ensuring the high availability, performance, and security of our messaging and streaming platform used for real-time data pipelines and event-driven applications.

Key Responsibilities:
- Install, configure, and manage Apache Kafka clusters in production and non-production environments.
- Monitor Kafka infrastructure using tools like Prometheus, Grafana, or Confluent Control Center.
- Perform upgrades, patching, tuning, and troubleshooting of Kafka brokers, ZooKeeper, Connect, and other ecosystem components.
- Implement security best practices, including SSL, SASL, and RBAC in Kafka.
- Ensure high availability and disaster recovery strategies for Kafka.
- Set up Kafka Connectors and Streams to integrate with external systems.
- Collaborate with application developers and data engineers to optimize Kafka usage.
- Automate routine tasks using shell scripts, Python, or Ansible (see the sketch below).
- Maintain Kafka documentation, including topology, policies, and configurations.
- Participate in on-call support rotation and incident response.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4 years of hands-on experience with Kafka administration.
- Strong understanding of Kafka architecture, internals, and ecosystem tools (Kafka Connect, Kafka Streams, Schema Registry).
- Experience with Linux system administration.
- Familiarity with containerization (Docker, Kubernetes) and CI/CD pipelines.
- Experience with monitoring and alerting systems.
- Scripting experience (Bash, Python, etc.).
- Understanding of networking, security protocols, and access control in distributed systems.
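A minimal sketch of the kind of routine Kafka administration that can be automated in Python, using the confluent-kafka client's admin API. The broker address, topic name, and partition/replication counts are hypothetical values for illustration.

```python
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker1:9092"})  # hypothetical broker

# Create a topic with explicit partitioning and replication, the kind of
# change an administrator would otherwise apply by hand.
futures = admin.create_topics(
    [NewTopic("orders.events", num_partitions=6, replication_factor=3)]
)
for topic, future in futures.items():
    try:
        future.result()  # Raises if the broker rejected the request.
        print(f"created {topic}")
    except Exception as exc:
        print(f"failed to create {topic}: {exc}")

# List existing topics as a quick inventory/health check.
metadata = admin.list_topics(timeout=10)
print(sorted(metadata.topics))
```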
Posted 1 month ago
3.0 - 8.0 years
45 - 50 Lacs
Hyderabad
Work from Office
As a Data Engineer you will build and maintain complex data pipelines and assemble large, complex datasets to generate business insights, enable data-driven decision making, and support the rapidly growing and dynamic business demand for data. You will have the opportunity to collaborate with teams of business analysts, managers, software development engineers, and data engineers to determine how best to design, implement, and support solutions. You will be challenged and provided with tremendous growth opportunity in a customer-facing, fast-paced, agile environment.

Responsibilities:
- Design, implement, and support analytical data platform solutions for data-driven decisions and insights.
- Design data schemas and operate internal data warehouses and SQL/NoSQL database systems.
- Work on data model designs, architecture, implementation, discussions, and optimizations.
- Interface with other teams to extract, transform, and load data from a wide variety of data sources using AWS big data technologies like EMR, Redshift, Elasticsearch, etc. (see the sketch below).
- Work with AWS technologies such as S3, Redshift, Lambda, and Glue, and explore and learn the latest AWS technologies to provide new capabilities and increase efficiency.
- Work on the data lake platform and its components, such as Hadoop and Amazon S3.
- Work with SQL-on-Hadoop technologies such as Spark, Hive, and Impala.
- Help continually improve ongoing analysis processes, optimizing or simplifying self-service support for customers.
- Possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Enjoy working closely with your peers in a group of talented engineers and gain knowledge.
- Be enthusiastic about building deep domain knowledge of Amazon's various business domains.
- Own the development and maintenance of ongoing metrics, reports, analyses, dashboards, etc., to drive key business decisions.

Requirements:
- 3+ years of data engineering experience.
- 4+ years of SQL experience.
- Experience with data modeling, warehousing, and building ETL pipelines.
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions.
- Experience with non-relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases).
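A minimal PySpark sketch of the extract-transform-load flow this posting describes: read raw JSON from S3, apply a transformation, and write partitioned Parquet back to the lake. The bucket names, paths, and columns are hypothetical; on AWS, a job like this would typically run on EMR or Glue, which provide the S3 connectors.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_example").getOrCreate()

# Extract: raw events landed in S3 by an upstream producer (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/orders/")

# Transform: basic cleansing plus a derived column for partitioning.
curated = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
)

# Load: columnar, partitioned output ready for Redshift Spectrum / Athena.
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```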
Posted 1 month ago
2.0 - 11.0 years
25 - 30 Lacs
Noida
Work from Office
Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/Bangalore
Education: B.E./B.Tech./MCA

Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good to have Skills: Snowpark, Data Build Tool (dbt), Finance Domain

Required Experience:
- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing (see the sketch below).
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development is good to have.
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities:
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe).
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging.
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.

Qualification: B.Tech/MCA
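A minimal sketch of the Snowflake Streams + Tasks pattern the posting highlights: a stream captures change records on a source table, and a scheduled task drains the stream into a downstream table. The connection parameters and all object names are hypothetical; the Streams/Tasks DDL follows standard Snowflake syntax.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # hypothetical
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGE",
)
cur = conn.cursor()

# A stream records inserts/updates/deletes on the source table.
cur.execute("CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders")

# A task drains the stream on a schedule; WHEN skips runs with no new data.
cur.execute("""
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = ETL_WH
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  INSERT INTO curated_orders
  SELECT order_id, amount, updated_at
  FROM raw_orders_stream
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK load_orders_task RESUME")
```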
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: SAP HCM Payroll
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions and ensure applications align with business needs.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular knowledge-sharing sessions.
- Stay updated on industry trends and best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP HCM Payroll.
- Strong understanding of SAP HR modules.
- Experience in SAP Payroll configuration and customization.
- Knowledge of SAP Payroll schemas and rules.
- Hands-on experience in SAP Payroll processing.
- Experience in SAP Payroll reporting.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP HCM Payroll.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 month ago
8.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Title: SAP MM/WM Consultant (ECC 6.0)
Experience: 8-10 Years
Location: Bangalore (Hybrid) | Remote
Employment Type: Contract

Job Summary:
We are seeking an experienced SAP MM Consultant with a strong background in SAP ECC 6.0 and expertise in Warehouse Management (WM). The ideal candidate will have in-depth experience in MM and WM configuration, at least one full lifecycle SAP implementation, and strong domain knowledge in subcontracting, consignment processes, MRP, and forecasting. Familiarity with Production Planning (PP) will be an added advantage.

Key Responsibilities:
- Configure and customize SAP MM and WM modules based on business requirements.
- Manage and support existing SAP MM/WM environments, ensuring stable operations.
- Work on subcontracting and consignment processes, ensuring accurate configuration and business process mapping.
- Configure and support the functionalities below within MM:
o Master data: Material Master, Vendor Master, Purchasing Info Records, Source List, Quota Arrangement
o Procurement process: Purchase Requisition (PR), Purchase Order (PO), Contracts, Scheduling Agreements
o Release procedures: Workflow and approval process for PRs and POs (both with and without classification)
o Inventory management: Goods Receipt, Goods Issue, Transfer Postings, Reservations, Consignment, Subcontracting
o Valuation and account determination: Integration with FI for automatic postings
o Pricing procedures: Condition records, access sequences, schema determination for purchasing
o Invoice verification: MIRO, MRBR, GR/IR clearing
o Batch management and serial number management (if applicable)
o Interfacing with barcode systems
- Map end-to-end warehouse business processes into SAP WM, including inbound, outbound, putaway, picking, packing, stock transfer, and physical inventory.
- Configure and support the following SAP WM components:
o Storage types, sections, and bins
o Putaway and picking strategies
o Transfer Orders (TO), Transfer Requirements (TR), and Posting Change Notices
o Inventory Management integration
o Batch Management and Serial Number tracking
o Replenishment and stock removal strategies
o Interfacing with barcode systems
- Strong understanding of SAP MRP logic, planning strategies (make-to-stock, make-to-order), lot-sizing procedures, and procurement types.
- Conduct MRP planning runs and analyze results for accuracy and optimization.
- Ability to configure MRP parameters, procurement types, and inventory controls.
- Collaborate with cross-functional teams, including PP, SD, and FI, for integrated solutions.
- Create functional specifications, test scripts, and training documentation.
- Troubleshoot and resolve production issues in MM/WM/PP areas.
- Provide end-user training and support as required.

Required Qualifications:
- 8-10 years of hands-on experience in SAP MM and WM modules within SAP ECC 6.0.
- Strong knowledge and experience in:
o MM and WM configuration
o Material Requirements Planning (MRP)
o Forecasting tools and configuration
o Process optimization
- Excellent understanding of business processes in procurement and warehouse management.
- In-depth knowledge of Master Data Management, Inventory Management, and Material Movements.
- Strong analytical and problem-solving skills.
- Effective communication and stakeholder management abilities.

Preferred Qualifications:
- Experience or exposure to SAP PP (Production Planning).
- Experience with integration points between MM, PP, SD, and FI.
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Lalitpur
Work from Office
We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks, and find alternative solutions to various problems. In addition, the role also demands leading, motivating, and mentoring other team members with respect to technical challenges.

What you'll be doing:
- Work in a small scrum team to design and build high-quality customer-facing software.
- Write unit, functional, and end-to-end tests using Mocha, Chai, Sinon, KarateJS, and CodeceptJS.
- Gain product knowledge by successfully developing features for our applications.
- Communicate effectively with stakeholders, peers, and others.
- Strive for continuous improvement in customer experience, tools, and processes.

What we're looking for:
- 5+ years of experience developing complex, scalable web-based applications using NodeJS.
- Minimum 2+ years of experience with ReactJS.
- Hands-on experience in designing and defining database schemas using RDBMS (MySQL) and NoSQL databases.
- Experience with RabbitMQ, Redis, Kibana, and Algolia is a plus.
- Experience writing scripts and doing ETL jobs.
- Good problem-solving and analytical skills.
- Experience in test-driven development (TDD).
- Experience with web services, REST APIs, GraphQL, and microservices.
- Experience with Shopify is preferred.
- Experience with Amazon AWS services and Docker.
- Experience with Git and the feature-branching workflow.
- Experience with continuous integration and continuous delivery (CI/CD) is a plus.
- Experience working in an Agile development environment is a plus.
- Awesome written and oral communication skills and the ability to work in a global and distributed environment, with the agility to mold communication for different audiences.
- Experience in the eCommerce/digital commerce domain with high-volume transactions is a plus.

You will have an opportunity to:
- COLLABORATE with global teams to build scalable web-based applications.
- PARTNER closely with the engineering team to follow best practices and standards.
- PROVIDE reliable solutions to a variety of problems using sound problem-solving techniques.
- WORK with the broader team to build and maintain high-performance, flexible, and highly scalable web-based applications.
- ACHIEVE engineering excellence by implementing standard practices and standards.
- PERFORM technical root cause analysis and outline corrective actions for given problems.
Posted 1 month ago
3.0 - 5.0 years
6 - 8 Lacs
Hyderabad
Work from Office
Job Description
Job Title: Generative SEO Specialist
Experience: 3-5 Years

The Role:
We are hiring a creative, analytical, and resourceful marketing strategist to pioneer our presence on the new frontier of search. In this role, you will be responsible for developing and executing a sophisticated strategy that bridges best-in-class SEO with the emerging world of AI-driven platforms like ChatGPT, Google Gemini, and Perplexity. This is a pivotal role for a quick thinker with 3-4+ years of experience who is passionate about reverse-engineering how Large Language Models (LLMs) discover, interpret, and recommend content. You will not just chase rankings; you will architect our brand's authority and visibility for a future where search is conversational. Your goal is to ensure our brand wins the high-intent moments, becoming a cited and trusted source in AI-generated answers, directly impacting business growth and revenue.

Key Responsibilities:

1. Visibility in LLMs & AI Search Engines (AEO/LLM-O)
- Develop and execute innovative strategies to gain and scale our visibility within AI-powered search and answer engines.
- Systematically reverse-engineer how LLMs surface sources, then identify and implement actionable tactics to influence our inclusion and prominence.
- Define the framework for LLM visibility tracking, establishing new KPIs, OKRs, and reporting dashboards to measure performance, influence, and ROI where no standard playbook exists.

2. Analytical & Semantic Content Strategy (SEO/SXO)
- Develop data-driven hypotheses on what LLMs prioritize and architect a semantic content strategy that aligns our website structure, content clusters, and internal linking with AI logic.
- Lead the creation of rich, engaging content assets, including interactive tools, lead magnets, videos, and white papers, that boost user engagement and establish our relevance with AI.
- Implement and manage a sophisticated structured data and schema markup strategy (FAQPage, Article, Event, etc.) to ensure our content is easily interpreted and featured by AI models (see the sketch after this posting).

3. Cross-Functional Collaboration & Influence
- Partner closely with product and engineering teams to define technical requirements for surfacing the data and signals consumed by LLMs.
- Collaborate with social media and brand teams to understand and leverage how social signals, brand authority, and user-generated content may influence AI rankings.
- Work with content and design teams to ideate and launch high-value assets that make our brand indispensable to both users and AI tools.

4. Performance, Geo-Targeting & Core SEO
- Own and report on LLM-specific performance KPIs, ensuring they align with core SEO metrics and broader business objectives, especially lead generation and revenue impact.
- Implement and optimize geo-targeting (GEO) strategies to capture location-based conversational prompts within AI tools.
- Support broader SEO initiatives, including page optimization, CTR improvements, and behavioral metric analysis, to maintain a strong foundational search presence.

What You Need to Succeed:
- Experience: 4+ years of hands-on SEO experience, with demonstrable exposure to content marketing, growth experimentation, or AI-driven projects.
- AI & LLM Curiosity: A deep and tangible fascination with how LLMs and AI search tools function. You are constantly experimenting and learning how to reverse-engineer these opaque systems.
- Sharp Analytical Skills: The ability to think critically, design KPIs and tracking systems for emerging platforms, and translate data into actionable insights.
- Creative & Resourceful Mindset: You are a smart, quick-thinking problem-solver who can ideate novel content formats and strategies that appeal to both human users and AI alike.
- Business Savvy: You have a user-centric and commercially minded approach, thinking in terms of customer experience, value, and revenue, not just abstract rankings.
- Excellent Collaborator: Clear and concise communication skills with a proven track record of working effectively across technical, creative, and strategic teams.
- Bonus Experience: Hands-on experience with advanced structured data, programmatic content creation, or video content strategy is a significant plus.
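A minimal sketch of the FAQPage structured-data markup this posting mentions, generated as JSON-LD. The schema.org types and properties (FAQPage, Question, acceptedAnswer, Answer) are the standard vocabulary; the question text is a hypothetical example.

```python
import json

faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is DevSecOps?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "DevSecOps integrates security practices into every "
                        "stage of the software delivery lifecycle.",
            },
        }
    ],
}

# The result is typically embedded in the page head so crawlers and AI answer
# engines can parse it: <script type="application/ld+json"> ... </script>
print(json.dumps(faq, indent=2))
```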
Posted 1 month ago
5.0 - 9.0 years
16 - 20 Lacs
New Delhi, Bengaluru
Work from Office
What is Findem:
Findem is the only talent data platform that combines 3D data with AI. It automates and consolidates top-of-funnel activities across your entire talent ecosystem, bringing together sourcing, CRM, and analytics into one place. Only 3D data connects people and company data over time, making an individual's entire career instantly accessible in a single click, removing the guesswork, and unlocking insights about the market and your competition no one else can. Powered by 3D data, Findem's automated workflows across the talent lifecycle are the ultimate competitive advantage. Enabling talent teams to deliver continuous pipelines of top, diverse candidates while creating better talent experiences, Findem transforms the way companies plan, hire, and manage talent. Learn more at www.findem.ai

Experience: 5-9 years
Location: Delhi, India (Hybrid, 3 days onsite)

We are looking for an experienced Big Data Engineer who will be responsible for building, deploying, and managing various data pipelines, a data lake, and big data processing solutions using big data and ETL technologies.

Responsibilities:
- Build data pipelines, big data processing solutions, and data lake infrastructure using various big data and ETL technologies.
- Assemble and process large, complex data sets that meet functional and non-functional business requirements.
- ETL from a wide variety of sources like MongoDB, S3, server-to-server, Kafka, etc., and process using SQL and big data technologies (see the sketch below).
- Build analytical tools to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Build interactive and ad-hoc query self-serve tools for analytics use cases.
- Build data models and data schemas for performance, scalability, and functional requirements.
- Build processes supporting data transformation, metadata, dependency, and workflow management.
- Research, experiment with, and prototype new tools/technologies and make them successful.

Skill Requirements:
- Must have: strong Python/Scala.
- Must have experience in big data technologies like Spark, Hadoop, Athena/Presto, Redshift, Kafka, etc.
- Experience with various file formats like Parquet, JSON, Avro, ORC, etc.
- Experience with workflow management tools like Airflow.
- Experience with batch processing, streaming, and message queues.
- Any visualization tools like Redash, Tableau, Kibana, etc.
- Experience working with structured and unstructured data sets.
- Strong problem-solving skills.

Good to have:
- Exposure to NoSQL stores like MongoDB.
- Exposure to cloud platforms like AWS, GCP, etc.
- Exposure to microservices architecture.
- Exposure to machine learning techniques.

The role is full-time and comes with full benefits. We are globally headquartered in the San Francisco Bay Area with our India headquarters in Bengaluru. Equal Opportunity
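A minimal sketch of a near real-time flow combining two technologies the posting names, Kafka and Spark: read a Kafka topic as a stream and land it in the lake as Parquet. The broker, topic, and paths are hypothetical, and running this assumes the Spark Kafka connector package is on the classpath.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_stream_example").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
    .option("subscribe", "user-events")                 # hypothetical topic
    .load()
)

# Kafka rows expose key/value as binary; cast to strings before storage.
decoded = events.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

query = (
    decoded.writeStream.format("parquet")
    .option("path", "s3://example-lake/user-events/")
    .option("checkpointLocation", "s3://example-lake/checkpoints/user-events/")
    .start()
)
query.awaitTermination()
```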
Posted 1 month ago
1.0 - 2.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Huron is redefining what a global consulting organization can be. Advancing new ideas every day to build even stronger clients, individuals, and communities. We're helping our clients find new ways to drive growth, enhance business performance, and sustain leadership in the markets they serve. And we're developing strategies and implementing solutions that enable the transformative change they need to own their future.

As a member of the Huron corporate team, you'll help to evolve our business model to stay ahead of market forces, industry trends, and client needs. Our accounting, finance, human resources, IT, legal, marketing, and facilities management professionals work collaboratively to support Huron's collective strategies and enable real transformation to produce sustainable business results. Join our team and create your future.

Huron's Corporate Workday team is comprised of business-minded technology professionals responsible for the ongoing optimization of our portfolio through product strategy, solution delivery, and support. Our team partners closely with our business stakeholders to identify challenges and opportunities to drive efficiencies and create real outcomes for our business. The product portfolios focus primarily on Workday but also contain integrations, bots, and other complementary solutions. We partner closely with our client-facing counterparts to share best practices and ensure Huron is at the cutting edge of Workday capabilities.

The Workday HCM Core Developer will be primarily responsible for analysis, design, and configuration across the Core HCM, Benefits, and Compensation modules within the Workday platform.

Requirements
- Minimum of 1-2 years configuring and supporting Workday HCM modules such as Core HCM, Compensation, and Benefits.
- Experience with Workday Reporting, Calculated Fields, EIB builds, schema, and Excel data analysis.
- Ability to translate business requirements into technical solutions and communicate effectively with stakeholders.
- Demonstrated ability to work with global HR teams and internal stakeholders to implement system enhancements.
- Strong analytical skills to troubleshoot and resolve system issues independently.
- Experience creating and maintaining documentation for Workday business processes and technical specifications.

Preferences
- Workday HCM Certification is preferred; familiarity with non-HCM modules and Workday Security is a plus.
- Experience with Agile development processes, including PI Planning and Sprint Reviews.
- Experience working with global teams and understanding regional HR requirements.
- Ability to mentor and guide junior team members.
- Proactive in identifying and implementing system improvements.
- Strong understanding of data security and compliance standards.

Position Level: Senior Analyst
Country: India
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Gurugram
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: SAP HCM Payroll
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating solutions that align with business needs and enhance application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the team in implementing innovative solutions.
- Conduct regular team meetings to ensure project progress.
- Stay updated on industry trends and technologies.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP HCM Payroll.
- Strong understanding of SAP HR modules.
- Experience in SAP Payroll configuration and processing.
- Knowledge of SAP Payroll schemas and rules.
- Experience in SAP Payroll reporting.
- Hands-on experience in SAP Payroll integration.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP HCM Payroll.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: SAP HCM Payroll
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement SAP HCM Payroll solutions.
- Collaborate with cross-functional teams to ensure successful application integration.
- Conduct testing and debugging of applications to ensure optimal performance.
- Provide technical support and guidance to end users.
- Stay updated on industry trends and best practices to enhance application development processes.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP HCM Payroll.
- Strong understanding of SAP Payroll configuration and customization.
- Experience in SAP Payroll schema and rules configuration.
- Knowledge of SAP Payroll reporting and data analysis.
- Hands-on experience in SAP Payroll implementation and support.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP HCM Payroll.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : IBM WebSphere DataPower
Good to have skills : Product and Market Strategy
Minimum 3 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with team members to ensure successful project delivery and application functionality.

Roles & Responsibilities:
- XSLT or GatewayScript expertise: proficient in using XSLT for transforming and processing data (a loose illustration follows this posting).
- REST and SOAP web services: extensive experience in developing and managing REST-based and SOAP-based web services using IBM DataPower.
- Code migration and implementation: skilled in migrating and implementing code on DataPower appliances.
- Solution development: proven ability to develop solutions using Web-Service Proxies, Multi-Protocol Gateways (MPG), and XML Firewalls.
- XML and related technologies: strong knowledge of XML, WSDL, XSLT, JSON, XML Schema, and XPath.

Professional & Technical Skills:
- Must-Have Skills: expertise in XSLT or GatewayScript, with proficiency in using GatewayScript for transforming and processing data.
- Good-To-Have Skills: strong understanding of REST and SOAP web services developed and managed with IBM DataPower; familiarity with code migration and implementation on DataPower appliances; strong knowledge of JSON and schema; solution development using Web-Service Proxies and Multi-Protocol Gateways (MPG).

Additional Information:
- The candidate should have a minimum of 3 years of experience in IBM WebSphere DataPower.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification : 15 years full time education
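On DataPower, transformations like the one below would be authored as XSLT stylesheets attached to a Multi-Protocol Gateway or Web-Service Proxy policy rule. As a language-neutral illustration of what such a stylesheet does, here is a minimal sketch that applies an XSLT transform with Python's lxml library; the XML payload and stylesheet are invented for illustration.

```python
# Minimal sketch: apply an XSLT transform the way a DataPower policy rule
# would, using lxml locally. Payload and stylesheet are illustrative only.
from lxml import etree

PAYLOAD = b"""<order><id>42</id><amount>19.99</amount></order>"""

STYLESHEET = b"""<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>
  <!-- Rename the root element and keep only the order id -->
  <xsl:template match="/order">
    <acknowledgement>
      <orderId><xsl:value-of select="id"/></orderId>
    </acknowledgement>
  </xsl:template>
</xsl:stylesheet>"""

transform = etree.XSLT(etree.fromstring(STYLESHEET))
result = transform(etree.fromstring(PAYLOAD))
print(etree.tostring(result, pretty_print=True).decode())
```

Testing a stylesheet standalone like this, before deploying it to an appliance, is a common way to shorten the edit-and-retry loop.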
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : IBM WebSphere DataPower
Good to have skills : NA
Minimum 3 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure seamless integration and functionality.

Roles & Responsibilities:
- XSLT or GatewayScript expertise: proficient in using XSLT for transforming and processing data.
- REST and SOAP web services: extensive experience in developing and managing REST-based and SOAP-based web services using IBM DataPower.
- Code migration and implementation: skilled in migrating and implementing code on DataPower appliances.
- Solution development: proven ability to develop solutions using Web-Service Proxies, Multi-Protocol Gateways (MPG), and XML Firewalls.
- XML and related technologies: strong knowledge of XML, WSDL, XSLT, JSON, XML Schema, and XPath.

Professional & Technical Skills:
- Must-Have Skills: expertise in XSLT or GatewayScript, with proficiency in using GatewayScript for transforming and processing data.
- Good-To-Have Skills: strong understanding of REST and SOAP web services developed and managed with IBM DataPower; familiarity with code migration and implementation on DataPower appliances; strong knowledge of JSON and schema; solution development using Web-Service Proxies and Multi-Protocol Gateways (MPG).

Additional Information:
- The candidate should have a minimum of 3 years of experience in IBM WebSphere DataPower.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification : 15 years full time education
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : IBM WebSphere DataPower
Good to have skills : NA
Minimum 5 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop solutions that align with business needs and requirements.

Roles & Responsibilities:
- XSLT or GatewayScript expertise: proficient in using XSLT for transforming and processing data.
- REST and SOAP web services: extensive experience in developing and managing REST-based and SOAP-based web services using IBM DataPower.
- Code migration and implementation: skilled in migrating and implementing code on DataPower appliances.
- Solution development: proven ability to develop solutions using Web-Service Proxies, Multi-Protocol Gateways (MPG), and XML Firewalls.
- XML and related technologies: strong knowledge of XML, WSDL, XSLT, JSON, XML Schema, and XPath.

Professional & Technical Skills:
- Must-Have Skills: expertise in XSLT or GatewayScript, with proficiency in using GatewayScript for transforming and processing data.
- Good-To-Have Skills: strong understanding of REST and SOAP web services developed and managed with IBM DataPower; familiarity with code migration and implementation on DataPower appliances; strong knowledge of JSON and schema; solution development using Web-Service Proxies and Multi-Protocol Gateways (MPG).

Additional Information:
- The candidate should have a minimum of 5 years of experience in IBM WebSphere DataPower.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification : 15 years full time education
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : IBM Netezza
Good to have skills : NA
Minimum 3 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that enhance business operations and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Collaborate with cross-functional teams to analyze business requirements and translate them into technical solutions.
- Develop and implement software solutions to meet business needs.
- Conduct code reviews and ensure compliance with coding standards.
- Troubleshoot and debug applications to optimize performance.
- Stay updated on emerging technologies and trends to enhance application development processes.

Professional & Technical Skills:
- Must-Have Skills: proficiency in IBM Netezza.
- Strong understanding of database management and SQL queries.
- Experience in ETL processes and data warehousing concepts.
- Hands-on experience in performance tuning and optimization of database queries (see the sketch after this posting).
- Good-To-Have Skills: experience with data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in IBM Netezza.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification : 15 years full time education
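Much of Netezza query tuning comes down to choosing a distribution key so that joined rows are co-located on the same data slice, and keeping optimizer statistics current. As a minimal sketch, the Python snippet below runs two such statements over an ODBC connection; the DSN, table, and column names are assumptions for illustration.

```python
# Minimal sketch: Netezza tuning basics over ODBC. The DSN "NZSQL" and the
# sales/customer_id names are illustrative, not from a real system.
import pyodbc

DDL = """
CREATE TABLE sales (
    sale_id     BIGINT,
    customer_id BIGINT,
    amount      NUMERIC(12, 2)
)
DISTRIBUTE ON (customer_id)  -- co-locate rows joined on customer_id
"""

with pyodbc.connect("DSN=NZSQL") as conn:
    cur = conn.cursor()
    cur.execute(DDL)
    # Refresh optimizer statistics so join plans stay accurate after loads.
    cur.execute("GENERATE STATISTICS ON sales")
    conn.commit()
```

Distributing fact and dimension tables on the same join key avoids data redistribution at query time, which is often the single biggest win on appliance-style MPP systems like Netezza.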
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Ab Initio
Good to have skills : NA
Minimum 3 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and enhance operational efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Collaborate with cross-functional teams to analyze requirements and develop software solutions.
- Design, develop, and test applications using Ab Initio to ensure high performance and reliability.
- Troubleshoot and debug applications to optimize functionality and enhance user experience.
- Document technical specifications and provide support during implementation and maintenance phases.
- Stay updated with industry trends and best practices to continuously improve application development processes.

Professional & Technical Skills:
- Must-Have Skills: proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration techniques (a generic illustration follows this posting).
- Experience with data quality management and data governance principles.
- Knowledge of SQL and database management systems.
- Good-To-Have Skills: experience with data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualification : 15 years full time education
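Ab Initio graphs are assembled in its graphical environment rather than written by hand, so the snippet below is only a generic Python stand-in for the ETL pattern the posting describes: extract records, reject the ones that fail data-quality rules, and transform the survivors. File names, fields, and rules are invented for illustration.

```python
# Generic ETL sketch (not Ab Initio itself): extract, apply data-quality
# rules, transform, and load. All names and rules are illustrative.
import csv

def extract(path: str):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def is_valid(row: dict) -> bool:
    """Data-quality gate: required key present and amount parses as positive."""
    try:
        return bool(row.get("customer_id")) and float(row["amount"]) > 0
    except (KeyError, ValueError):
        return False

def transform(row: dict) -> dict:
    return {"customer_id": row["customer_id"].strip(),
            "amount_cents": round(float(row["amount"]) * 100)}

def run(src: str, dst: str, rejects: str) -> None:
    good, bad = [], []
    for row in extract(src):
        (good if is_valid(row) else bad).append(row)
    with open(dst, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["customer_id", "amount_cents"])
        writer.writeheader()
        writer.writerows(transform(r) for r in good)
    with open(rejects, "w", newline="") as f:  # keep rejects for audit trails
        f.write("\n".join(map(str, bad)))

if __name__ == "__main__":
    run("orders.csv", "orders_clean.csv", "orders_rejects.txt")  # hypothetical files
```

Routing failed records to a reject file instead of silently dropping them mirrors the reject-port convention of graph-based ETL tools and supports the data-governance requirement in the posting.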
Posted 1 month ago