1.0 years
0 Lacs
hyderabad
On-site
DESCRIPTION The candidate will be responsible for maintaining/refreshing WBRs and other analytical frameworks set up by senior analysts. They will also build simple reports, take up dive-deep requests, make changes to existing analytical frameworks, and provide ad hoc data support to Ops stakeholders. The person should have a good understanding of business requirements and the ability to quickly get to the root cause of a particular reporting/BI/data issue and draft solutions for resolution. The ideal candidate will have high attention to detail, a bias for action, and an interest in analytics/BI/automation. Key result areas include, but are not limited to: Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions. Ensure data accuracy by validating data for new and existing resources. Work closely with stakeholders (internal/external) to understand and automate/enhance existing processes. Be open to learning and developing skill sets in the latest technologies and analytical techniques. Understand how data/analytical frameworks and their work translate to business on the ground. Come up with innovative ideas for new work or to improve existing work. BASIC QUALIFICATIONS 1+ years of data analytics or automation experience. Bachelor's degree. Knowledge of data pipelining and extraction using SQL. Knowledge of SQL and Excel at a moderate or advanced level. Knowledge of SQL/Python/R, scripting, MS Excel, table joins, and aggregate analytical functions. Expertise with visualization tools such as QuickSight, Tableau, or Power BI. PREFERRED QUALIFICATIONS Experience in Linux and AWS services. Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
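The SQL skills listed in the posting above (table joins and aggregate functions) can be illustrated with a minimal, self-contained sketch. The `orders`/`regions` tables are hypothetical, and SQLite stands in for whatever warehouse the role actually uses:

```python
import sqlite3

# Hypothetical tables illustrating the join + aggregate skills listed above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE regions (region_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders  (order_id INTEGER PRIMARY KEY, region_id INTEGER, amount REAL);
INSERT INTO regions VALUES (1, 'North'), (2, 'South');
INSERT INTO orders  VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Join the two tables and aggregate revenue per region.
rows = conn.execute("""
    SELECT r.name, COUNT(o.order_id) AS n_orders, SUM(o.amount) AS revenue
    FROM orders o
    JOIN regions r ON r.region_id = o.region_id
    GROUP BY r.name
    ORDER BY revenue DESC
""").fetchall()

for name, n, revenue in rows:
    print(name, n, revenue)  # e.g. North 2 200.0
```

The same JOIN/GROUP BY pattern carries over directly to the Python/R scripting the qualifications mention.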
Posted 20 hours ago
1.0 years
0 Lacs
bengaluru
On-site
DESCRIPTION Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. Amazon India is launching a new service, Strategic Brand Services, aimed at offering dedicated support to top-tiered brands to grow with Amazon. Under this service, Brand Specialists will work on identifying and improving key customer inputs for growth, such as content, marketing, and stock availability, among others. Apart from this, the Brand Specialists will also help brands leverage Amazon’s tools and programs to improve their business inputs. We are seeking creative, goal-oriented and highly entrepreneurial people to join our exciting and fast-paced team. About the Role: As a Brand Specialist, you will focus on delivering 5 core focus areas for the brand: selection, demand generation, catalogue quality, business advice, and availability. The person who joins the leadership team in this position must share our passion for and commitment to serving our customers. The ideal candidate should have experience in forging and building brand relationships. Some understanding of planning product cycles and selling online is preferred. The right candidate will be flexible, action- and results-oriented, and self-starting, with strong analytical skills. They must have a proven track record of taking ownership, driving results, and moving with speed to implement ideas in a fast-paced environment. They should be entrepreneurial, with the confidence to make independent, data-driven decisions. The candidate must demonstrate the ability to succeed at planning and forecasting and at driving an online business. The candidate must be an effective communicator in working with some of Amazon’s most important partners and vendors, as well as with internal colleagues and groups. Responsibilities This person will have responsibility for: Building selection: Identify selection gaps.
Track the brand’s offline catalogue to ensure all relevant selection is present on Amazon. Demand generation: Responsible for demand generation, including working with other members of the category management team to create a marketing calendar based on the vendor's objectives. Business advice: Support the brand's participation in Amazon programs. Availability: Ensure continuous availability of products. Catalogue quality on Amazon: Ensure the best input from the brand is updated for the customer interface on Amazon detail pages through perfect images, product descriptions, etc. BASIC QUALIFICATIONS 1+ years of account management, project or program management, or buying experience. Bachelor's degree. Experience using analytics tools such as Google Analytics, SQL, or HTML. PREFERRED QUALIFICATIONS Experience in process improvement. Experience managing large amounts of data. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 20 hours ago
1.0 years
0 Lacs
bengaluru
On-site
DESCRIPTION The candidate will be responsible for maintaining/refreshing WBRs and other analytical frameworks set up by senior analysts. They will also build simple reports, take up dive-deep requests, make changes to existing analytical frameworks, and provide ad hoc data support to Ops stakeholders. The person should have a good understanding of business requirements and the ability to quickly get to the root cause of a particular reporting/BI/data issue and draft solutions for resolution. The ideal candidate will have high attention to detail, a bias for action, and an interest in analytics/BI/automation. Key result areas include, but are not limited to: Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions. Ensure data accuracy by validating data for new and existing resources. Work closely with stakeholders (internal/external) to understand and automate/enhance existing processes. Be open to learning and developing skill sets in the latest technologies and analytical techniques. Understand how data/analytical frameworks and their work translate to business on the ground. Come up with innovative ideas for new work or to improve existing work. BASIC QUALIFICATIONS 1+ years of data analytics or automation experience. Bachelor's degree. Knowledge of data pipelining and extraction using SQL. Knowledge of SQL and Excel at a moderate or advanced level. Knowledge of SQL/Python/R, scripting, MS Excel, table joins, and aggregate analytical functions. Expertise with visualization tools such as QuickSight, Tableau, or Power BI. PREFERRED QUALIFICATIONS Experience in Linux and AWS services. Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 20 hours ago
10.0 years
0 Lacs
mumbai, maharashtra, india
On-site
🚀 We’re Hiring: Regional Sales Manager – Resiliency (Mumbai)📍 Mumbai, On-site, Full-time About itSimple itSimple is a niche leader in Backup & Archival Solutions, serving large Government & Enterprise customers across India. Partnering with global leaders like Veeam, Atempo, Druva, and Commvault, we are experiencing exponential growth in market share, revenue, and profitability. We’re selective about who joins the itSimple Family — and we nurture them to achieve their professional dreams. 🔑 Your Role (Mumbai) As Regional Sales Manager – Resiliency (Mumbai), you will lead the West India market, focusing on new business acquisition and expansion across BFSI, Pharma, and Large SIs (HPE, etc.). Drive sales strategy for resiliency solutions in Mumbai. Lead & mentor an existing 1-member sales team. Build deep client and partner relationships. Achieve ₹5 Cr annual revenue target in Year 1. 🎯 What We’re Looking For 10+ years in Enterprise/B2B Solution Sales. Strong industry network in BFSI, Pharma, Large System Integrators. Track record of delivering multi-crore revenue targets. Partner-driven sales execution. Bachelor’s/Master’s in Business, Marketing, or Technology. Knowledge of Backup/Archival domain preferred. Government sector exposure is a big plus. 🌟 Skills That Make You Shine Strategic selling Negotiation Team leadership Client management Communication Result orientation
Posted 22 hours ago
7.0 years
0 Lacs
surat, gujarat, india
Remote
About THRYL THRYL (by Glazer Games) is India’s social gaming & esports platform—ethical, non-cash-stakes, community-first. We combine tournaments, creator-led content, and a massive catalog of casual games with on-platform rewards (THRYL Token) and offerwall incentives. We publish video content via Glazer Games channels and run a thriving Discord community. We’re building for scale with brand safety at the core: no gambling, no fantasy, no betting, no pay-to-enter tournaments. Why this role now We’re expanding our organic growth engine across Instagram, X, YouTube (via Glazer Games), Discord, and community surfaces. We need a player-coach who can own strategy and ship content daily, activate creators, and turn viewers → users → advocates, working closely with Esports Production, UA/Growth, and Product. Role overview: Title: Social & Content Lead (Manager) Function: Marketing (Organic Growth / Content / Community) Reports to: Head of Marketing / Growth Location: Surat, Gujarat Schedule: Full-time; some evenings/weekends for live events Key outcomes (first 90 days) Ship a refreshed channel strategy with weekly calendars for IG, X, YT, Discord, LinkedIn. Establish a content lab: hooks, thumbnails, shorts workflow, A/Bs, and a clean asset tracker. Launch creator pilots (≥ 6 per month) with clear CTAs (install, join Discord, watch live). Build a highlights pipeline from esports/streams → 20–30 shorts/reels per month. Deliver a weekly KPI report with insights and actions. Key outcomes (6 months) MoM growth on primary channels; improved watch time, ER, and CTR vs baseline. Repeatable creator program (≥ 10 active creators/month) with predictable deliverables. Discord: consistent events cadence; improved activation & retention. Organic social contributing meaningful installs / in-app actions (tracked via UTMs/postbacks where possible). Zero policy violations: brand-safe and compliant content at scale.
Responsibilities (what you’ll own) 1) Strategy & Planning Channel-wise strategy (IG, X, YT via Glazer Games, Discord, LinkedIn) mapped to funnel goals. Monthly/weekly calendars, campaign timelines, and content priorities (product beats, esports, community). Narrative arcs around creator collabs, tournaments, feature drops, and tentpole moments. 2) Content Production & Publishing Briefs, scripting, editing supervision (or hands-on for shorts), thumbnail craft. Turn long-form/livestream assets into snackable highlights, memes, and recaps. On-time publishing, pinning, and cross-posting; UTM hygiene and basic SEO for YT. Maintain a single source of truth: asset library, naming, versioning, rights, and clear approvals. 3) Creator & Influencer Program Source, vet, and onboard creators aligned to our brand safety standards. Negotiate deliverables and rates; draft briefs and track outputs in a shared tracker. Grow a dependable bench across Hindi & regional languages; pilot revenue-share/affiliate experiments. Post-campaign reports with learnings and next-step proposals. 4) Community & Discord Ops Daily prompts, polls, AMAs, and creator spotlights; weekly challenges & leaderboards. Moderation and safety: enforce community rules, escalate issues quickly. Coordinate with Campus Ambassadors and power users to amplify events and UGC. 5) Live & Esports Content Work with Broadcast/Production to pre-plan shots and storylines for highlight capture. Create real-time social: countdowns, live tweets, clip-drops, MVP spotlights, post-match reels. Turn tournament outcomes into evergreen content (top plays, pro tips, micro-lessons). 6) Measurement & Insights Weekly dashboards: follower growth, ER, watch time, CTR, installs, Discord health. Content lab: A/B hooks & thumbnails; test posting times & formats; iterate fast. Extract insights that feed back into Product/UA (what converts, which narratives work).
7) Governance, Brand Safety & Compliance Zero tolerance for gambling/fantasy/betting or paid-entry-fee narratives. Adhere to THRYL policies (Terms of Use, Token & Conversion, Offerwall, Community Guidelines, Brand Safety & UA Addendum, Child Safety & Age Gate, Cookie/Tracking). Age-appropriate creative; accurate disclaimers on rewards & redemptions; responsible influencer disclosures. KPIs & health metrics Audience Growth: MoM follower growth by channel. Engagement: ER/post & ER/video; avg watch time (incl. 30-sec hold). Acquisition: CTR to store/landing; installs from social; CPI/CPE benchmarks (where trackable). Community: Discord joins, DAUs, message velocity, event participation. Throughput: planned vs published; on-time rate; creator deliverables completion. Safety: 0 takedowns, 0 policy strikes, 0 age-gate violations. We set baselines together in Month 1; targets will be ambitious but realistic for each channel’s current state. Must-have qualifications 4–7 years in social/content for gaming/esports/consumer apps or youth brands. A portfolio of shorts/reels/threads that demonstrably moved metrics. Hands-on editing (CapCut/Premiere) and thumbnail craft; strong hook & copywriting skills. Analytics fluency: IG Insights, YT Studio, X analytics; UTM discipline; comfort with GA4. End-to-end creator ops: sourcing → briefs → contracts → tracking → reporting. Experience running Discord communities and live/social coverage for events. Clear understanding of India’s online gaming sensitivities and ad/creator disclosure norms. Good-to-have Managed a 50k+ member Discord; regional language content ops. Basic motion/design (AE/Canva/Figma) proficiency. Familiarity with PostHog/Mixpanel, AppsFlyer/Branch, or similar attribution tools. Esports tournament content experience; broadcast coordination. Tool stack (we use/accept equivalents) Planning/PM: ClickUp/Notion, Google Workspace. Editing/Design: CapCut, Premiere Pro, Photoshop/Canva, Figma.
Publishing/Listening: Native schedulers; Hootsuite/Later/Sprout optional. Analytics: IG/X native, YouTube Studio, GA4; UTM builder; (nice) PostHog/Mixpanel. Utilities: Frame.io/Drive for review; Bitly; Discord bots for moderation. Working model & benefits Location: India (Hybrid/Remote); travel for shoots/LAN events as needed. Hours: Standard weekdays + event-aligned evenings/weekends. Compensation: Competitive salary + performance bonus; equipment & learning stipend. Perks: Game credits, creator passes, and access to esports & studio shoots. Growth path 12–18 months: Step up to Senior Lead owning organic growth OKRs. 18–24 months: Build a content pod (editor + designer + community exec) and/or move toward Head of Content/Community. Process & rituals Weekly: Content review & KPI stand-up; creator pipeline check-in. Bi-weekly: Experiment review (what we A/B’d, what we learned). Monthly: Retrospective and next-month plan; leadership read-out. Interview process 20-min intro screen (fit & portfolio walkthrough). 45-min craft interview (hooks, thumbnails, analytics stories). Take-home (48–72 hrs): see below. 60-min panel with Growth, Esports, and Creator Ops. Founder conversation; references; offer. Sample take-home (2–3 hrs cap) Part A (Strategy): 2-week calendar (IG, YT Shorts, X) for a mini-tournament & feature launch. Part B (Craft): Write 5 hooks and 3 thumbnail headlines for one gameplay highlight; mock one thumbnail. Part C (Analysis): Review a provided dashboard; list 5 insights and 5 actions. How to apply Send your resume/LinkedIn and a 10-slide or 2-page portfolio: best posts, before/after thumbnails, short video samples, and one mini-case on a channel you grew (goals → tactics → results). Email to anand@glazer.games with subject: “THRYL — Social & Content Lead — YourName”.
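The "UTM hygiene" mentioned in the responsibilities above comes down to tagging every shared link consistently. A small sketch using only the Python standard library; the URL and campaign names are hypothetical examples, not THRYL's actual links:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_utm(url, source, medium, campaign):
    # Append standard utm_* parameters without clobbering existing query args.
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({"utm_source": source, "utm_medium": medium,
                  "utm_campaign": campaign})
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical install link tagged for an Instagram post.
link = add_utm("https://example.com/install", "instagram", "social", "cup_finale")
print(link)
```

Routing every creator CTA through a helper like this keeps attribution reports consistent across channels.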
Posted 23 hours ago
4.0 - 8.0 years
0 Lacs
coimbatore, tamil nadu, india
On-site
Job Title: ETL Developer. Job Location: Coimbatore. Type: WFO. Job Description Key Responsibilities: ETL Design and Development: Design and develop efficient, scalable SSIS packages to extract, transform, and load data between systems. Translate business requirements into technical ETL solutions using data flow and control flow logic. Develop reusable ETL components that support modular, configuration-driven architecture. Data Integration and Transformation: Integrate data from multiple heterogeneous sources: SQL Server, flat files, APIs, Excel, etc. Implement business rules and data transformations such as cleansing, standardization, enrichment, and deduplication. Manage incremental loads, full loads, and slowly changing dimensions (SCD) as required. SQL and Database Development: Write complex T-SQL queries, stored procedures, and functions to support data transformations and staging logic. Perform joins, unions, aggregations, filtering, and windowing operations effectively for data preparation. Ensure referential integrity and proper indexing for performance. Performance Tuning: Optimize SSIS packages by tuning buffer sizes, using parallelism, and minimizing unnecessary transformations. Tune SQL queries and monitor execution plans for efficient data movement and transformation. Implement efficient data loads for high-volume environments. Error Handling and Logging: Develop error-handling mechanisms and event logging in SSIS using Event Handlers and custom logging frameworks. Implement restartability, checkpoints, and failure notifications in workflows. Testing and Quality Assurance: Conduct unit and integration testing of ETL pipelines. Validate data outputs against source and business rules. Support QA teams in user acceptance testing (UAT) and defect resolution. Deployment and Scheduling: Package, deploy, and version SSIS solutions across development, test, and production environments.
Schedule ETL jobs using SQL Server Agent or enterprise job schedulers (e.g., Control-M, Tidal). Monitor and troubleshoot job failures and performance issues. Documentation and Maintenance: Maintain documentation for ETL designs, data flow diagrams, transformation logic, and job schedules. Update job dependencies and maintain audit trails for data pipelines. Collaboration and Communication: Collaborate with data architects, business analysts, and reporting teams to understand data needs. Provide technical support and feedback during requirements analysis and post-deployment support. Participate in sprint planning, status reporting, and technical reviews. Compliance and Best Practices: Ensure ETL processes comply with data governance, security, and privacy regulations (HIPAA, GDPR, etc.). Follow team coding standards, naming conventions, and deployment protocols. Required Skills & Experience 4-8 years of hands-on experience with ETL development using SSIS. Strong SQL Server and T-SQL skills. Solid understanding of data warehousing concepts and best practices. Experience with flat files, Excel, APIs, or other common data sources. Familiarity with job scheduling and monitoring (e.g., SQL Server Agent). Strong analytical and troubleshooting skills. Ability to work independently and meet deadlines. Preferred Skills Exposure to Azure Data Factory or cloud-based ETL tools. Experience with Power BI or other reporting platforms. Experience in healthcare, finance, or regulated domains is a plus. Knowledge of version control tools like Git or Azure DevOps. (ref:hirist.tech)
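The slowly-changing-dimension handling the posting above describes is normally built with SSIS components, but the underlying Type 1 (overwrite) logic is a plain SQL upsert. A hedged sketch with SQLite standing in for SQL Server and a hypothetical `dim_customer` table:

```python
import sqlite3

# Hypothetical dimension table; a Type 1 SCD overwrite via UPSERT,
# standing in for the SSIS SCD component described in the posting.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, city TEXT)")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Chennai'), (2, 'Pune')")

# Incoming staged rows: one changed attribute, one brand-new key.
incoming = [(1, 'Coimbatore'), (3, 'Surat')]
conn.executemany("""
    INSERT INTO dim_customer (customer_id, city) VALUES (?, ?)
    ON CONFLICT(customer_id) DO UPDATE SET city = excluded.city
""", incoming)

rows = conn.execute("SELECT * FROM dim_customer ORDER BY customer_id").fetchall()
print(rows)
```

A Type 2 variant would instead close out the old row (effective-date columns) and insert a new version; the upsert above is the simpler overwrite pattern.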
Posted 1 day ago
0.0 - 3.0 years
0 Lacs
hyderabad, telangana
On-site
As a STACK Developer at our company, you will be responsible for the following: - Utilizing your expertise in MySQL, MVC architecture, MongoDB, and Node.js. - Demonstrating proficiency in Ext JS, jQuery, AngularJS, Bootstrap, JavaScript, Vue.js, and React. - Applying strong knowledge of OOP, open-source concepts, and HTML, CSS, and JavaScript (AJAX, JSON, DOM). - Working with Node.js and at least one front-end JavaScript framework. - Showcasing strong DBMS knowledge. - Mastering MySQL, including triggers, views, joins, stored procedures, database design, and normalisation. To excel in this role, you must possess the following qualifications: - 0-1 years of experience in a similar role. - Knowledge and experience in MySQL, MVC architecture, MongoDB, and Node.js. - Proficiency in Ext JS, jQuery, AngularJS, Bootstrap, JavaScript, Vue.js, and React. - Strong understanding of OOP, open-source concepts, HTML, CSS, and JavaScript. - Mastery of MySQL, including triggers, views, joins, stored procedures, database design, and normalisation. If you are passionate about stack development and possess the required skills and qualifications, we encourage you to apply for this full-time position with a notice period of immediate to 15 days. Send your resume to hr@sedots.com to be considered for one of the 4-6 available positions.
Posted 1 day ago
7.0 years
0 Lacs
india
On-site
About Latinum: Latinum is hiring for multiple backend engineering roles. You must demonstrate strong capabilities in either Core Java backend engineering or Microservices and Cloud architecture, with working knowledge in the other. Candidates with strengths in both areas will be considered for senior roles. You will be part of a high-performance engineering team solving complex business problems through robust, scalable, and high-throughput systems. Experience: A minimum of 7+ years of hands-on experience is mandatory. Java & Backend Engineering Java 8+ (Streams, Lambdas, Functional Interfaces, Optionals) Spring Core, Spring Boot, object-oriented principles, exception handling, immutability Multithreading (Executor framework, locks, concurrency utilities) Collections, data structures, algorithms, time/space complexity Kafka (producer/consumer, schema, error handling, observability) JPA, RDBMS/NoSQL, joins, indexing, data modelling, sharding, CDC JVM tuning, GC configuration, profiling, dump analysis Design patterns (GoF: creational, structural, behavioral) Microservices, Cloud & Distributed Systems REST APIs, OpenAPI/Swagger, request/response handling, API design best practices Spring Boot, Spring Cloud, Spring Reactive Kafka Streams, CQRS, materialized views, event-driven patterns GraphQL (Apollo/Spring Boot), schema federation, resolvers, caching Cloud-native apps on AWS (Lambda, IAM, S3, containers) API security (OAuth 2.0, JWT, Keycloak, API Gateway configuration) CI/CD pipelines, Docker, Kubernetes, Terraform Observability with ELK, Prometheus, Grafana, Jaeger, Kiali Additional Skills (Nice to Have) Node.js, React, Angular, Golang, Python, GenAI Web platforms: AEM, Sitecore Production support, rollbacks, canary deployments TDD, mocking, Postman, security/performance test automation Architecture artifacts: logical/sequence views, layering, solution detailing Key Responsibilities: Design and develop scalable backend systems using Java and Spring Boot. Build event-driven microservices and cloud-native APIs. Implement secure, observable, and high-performance solutions. Collaborate with teams to define architecture, patterns, and standards. Contribute to solution design, code reviews, and production readiness. Troubleshoot, optimize, and monitor distributed systems in production. Mentor junior engineers (for senior roles).
Posted 1 day ago
5.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Job Summary We are seeking a highly skilled and detail-oriented Senior SQL Data Analyst to join our data-driven team. This role will be responsible for leveraging advanced SQL skills to extract, analyze, and interpret complex datasets, delivering actionable insights to support business decisions. You will work closely with cross-functional teams to identify trends, solve problems, and drive data-informed strategies across the organization. Key Responsibilities Develop, write, and optimize advanced SQL queries to retrieve and analyze data from multiple sources. Design and maintain complex data models, dashboards, and reports. Collaborate with stakeholders to understand business needs and translate them into analytical requirements. Conduct deep-dive analysis to identify key business trends and opportunities for growth or improvement. Ensure data integrity and accuracy across systems and reporting tools. Automate recurring reports and develop scalable data pipelines. Present findings in a clear, compelling way to both technical and non-technical audiences. Qualifications Required: Bachelor's degree in Computer Science, Information Systems, Mathematics, Statistics, or related field. 5+ years of experience in data analysis or a similar role with a strong focus on SQL. Expert proficiency in SQL (window functions, joins, CTEs, indexing, etc.). Strong understanding of data warehousing concepts and relational database systems (e.g., PostgreSQL, SQL Server, Snowflake, Redshift). Experience with BI tools like Tableau, Power BI, or Looker. Excellent analytical, problem-solving, and communication skills. Preferred: Experience with scripting languages (Python, R) for data manipulation. Familiarity with cloud data platforms (AWS, Azure). Knowledge of ETL tools and best practices. Previous experience in a fast-paced, agile environment.
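The window function and CTE proficiency the posting above asks for can be shown in a small self-contained example; the `sales` table is hypothetical, and SQLite (3.25+, which supports window functions) stands in for the production warehouse:

```python
import sqlite3  # bundled SQLite must be >= 3.25 for window functions

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (rep TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('asha', '2024-01', 100), ('asha', '2024-02', 150),
  ('ravi', '2024-01', 200), ('ravi', '2024-02', 120);
""")

# CTE + window function: find each rep's best month by sales amount.
rows = conn.execute("""
    WITH ranked AS (
        SELECT rep, month, amount,
               RANK() OVER (PARTITION BY rep ORDER BY amount DESC) AS rk
        FROM sales
    )
    SELECT rep, month FROM ranked WHERE rk = 1 ORDER BY rep
""").fetchall()
print(rows)
```

The same CTE/window pattern (PARTITION BY per entity, ORDER BY per metric) transfers unchanged to PostgreSQL, Snowflake, or Redshift.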
Posted 1 day ago
4.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Seaspan teams are goal-driven and share a high-performance culture, focusing on building services offerings to become a leading asset manager. Seaspan provides many of the world's major shipping lines with alternatives to vessel ownership by offering long-term leases on large, modern containerships and pure car and truck carriers (PCTCs), combined with industry-leading ship management services. Seaspan's fleet has evolved over time to meet the varying needs of our customer base. We own vessels in a wide range of sizes, from 2,500 TEU to 24,000 TEU vessels. As a wholly owned subsidiary of Atlas Corp, Seaspan delivers on the company's core strategy as a leading asset management and core infrastructure company. Position description: We are seeking a highly skilled and versatile Cloud Data Specialist to join our Data Operations team. Reporting to the Team Lead, Data Operations, the Cloud Data Specialist plays a key role in the development, administration, and support of our Azure-based data platform, with a particular focus on Databricks, data pipeline orchestration using tools like Azure Data Factory (ADF), and environment management using Unity Catalog. A strong foundation in data engineering, cloud data administration, and data governance is essential. Development experience using SQL and Python is required. Knowledge or experience with APIM is nice to have. Primary responsibilities: Data Engineering and Platform Management: Design, develop, and optimize scalable data pipelines using Azure Databricks and ADF. Administer Databricks environments, including user access, clusters, and Unity Catalog for data lineage, governance, and security. Support the deployment, scheduling, and monitoring of data workflows and jobs in Databricks and ADF. Implement best practices for CI/CD, version control, and operational monitoring for pipeline deployments. Implement and manage Delta Lake to ensure reliable, performant, and ACID-compliant data operations.
Data Modeling and Integration: Collaborate with business and data engineering teams to design data models that support analytics and reporting use cases. Support integration of data from multiple sources into the enterprise data lake and data warehouse. Configure API calls to utilize our Azure APIM platform. Maintain and enhance data quality, structure, and performance within the Lakehouse and warehouse architecture. Collaboration and Stakeholder Engagement: Work cross-functionally with business units, data scientists, BI analysts, and other stakeholders to understand data requirements. Translate technical solutions into business-friendly language and deliver clear documentation and training when required. Required Technical Expertise: Apache Spark (on Databricks): Proficiency in PySpark and Spark SQL. Spark optimization techniques (caching, partitioning, broadcast joins). Writing and scheduling notebooks/jobs in Databricks. Understanding of Delta Lake architecture and features. Working with Databricks Workflows (pipelines and job orchestration). SQL/Python Programming: Handling JSON, XML, and other semi-structured formats. Experience with API integration using requests, http, etc.
Error handling and logging.
API Ingestion: Designing and implementing ingestion pipelines for RESTful APIs; transforming and loading JSON responses to Spark tables.
Cloud & Data Platform Skills:
Databricks on Azure: Cluster configuration and management; Unity Catalog features (optional but good to have).
Azure Data Factory: Creating and managing pipelines for orchestration; linked services and datasets for ADLS, Databricks, SQL Server; parameterized and dynamic ADF pipelines; triggering Databricks notebooks from ADF.
Data Engineering Foundations: Data modeling and warehousing concepts; ETL/ELT design patterns; data validation and quality checks; working with structured and semi-structured data (JSON, Parquet, Avro).
DevOps & CI/CD: Git/GitHub for version control; CI/CD using Azure DevOps or GitHub Actions for Databricks jobs; infrastructure-as-code (Terraform for Databricks or ADF).

Additional Requirements: Bachelor's degree in computer science, information systems, or a related field. 4+ years of experience in a cloud data engineering, data platform, or analytics engineering role. Familiarity with data governance, security principles, and data quality best practices. Excellent analytical thinking and problem-solving skills. Strong communication skills and ability to work collaboratively with technical and non-technical stakeholders. Microsoft certifications in Azure Data Engineer, Power Platform, or a related field are desired. Experience with Azure APIM is nice to have. Knowledge of enterprise data architecture and data warehouse principles (e.g., dimensional modeling) is an asset.

Job Demands and/or Physical Requirements: As Seaspan is a global company, occasional work outside of regular office hours may be required.

Compensation and Benefits package: Seaspan’s total compensation is based on our pay-for-performance philosophy that rewards team members who deliver on and demonstrate our high-performance culture.
The exact base salary offered will be commensurate with the incumbent’s experience, job-related skills and knowledge, and internal pay equity. Seaspan Corporation is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, race, color, religion, gender, sexual orientation, gender identity, national origin, disability, or protected Veteran status. We thank all applicants in advance. If your application is shortlisted to be included in the interview process, one of our team will be in contact with you. Please note that while this position is open in both Vancouver and Mumbai, it represents a single headcount. The role will be filled in one of the two locations based on candidate availability and suitability, determined by the hiring team.
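A minimal sketch of the JSON/API-ingestion skills this posting lists: flattening a nested API response into flat rows ready for a table load. The payload shape and every field name are invented for illustration, not taken from any Seaspan system.

```python
import json

def flatten_vessel_records(payload):
    """Flatten a nested JSON API response into flat rows suitable
    for loading into a Spark or warehouse table."""
    doc = json.loads(payload)
    rows = []
    for vessel in doc.get("vessels", []):
        for call in vessel.get("port_calls", []):
            rows.append({
                "vessel_id": vessel["id"],
                "teu": vessel.get("teu"),   # optional field, may be absent
                "port": call["port"],
                "eta": call["eta"],
            })
    return rows

sample = ('{"vessels": [{"id": "V1", "teu": 15000, '
          '"port_calls": [{"port": "SGSIN", "eta": "2024-05-01"}, '
          '{"port": "CNSHA", "eta": "2024-05-09"}]}]}')
rows = flatten_vessel_records(sample)
```

In a real pipeline the resulting list of dicts would typically be handed to spark.createDataFrame or written out as Parquet; the flattening logic is the same either way.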
Posted 1 day ago
0 years
0 Lacs
gurgaon
On-site
JOB DESCRIPTION About KPMG in India: KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Jaipur, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

1. Core Java: Strong understanding of Java SE 8+ (OOP, Collections, Exception Handling, Generics, Streams, Multithreading). Knowledge of design patterns (Singleton, Factory, DAO, etc.).
2. Spring MVC: Understanding of MVC architecture; Controllers, Models, Views; Dependency Injection & Inversion of Control; Spring configuration (XML & annotation-based); form handling & validation; RESTful services with Spring MVC.
3. Hibernate (ORM): Mapping entities with annotations/XML; JPA (Java Persistence API); CRUD operations with Hibernate; HQL (Hibernate Query Language).
4. Frontend Basics: JavaScript (DOM manipulation, events, ES6 basics); jQuery (selectors, AJAX calls, event handling); HTML5 and CSS3 basics for UI integration.
5. Database Skills: PostgreSQL; schema design, indexing, joins; writing SQL queries, stored procedures, functions; knowledge of database transactions and database optimization.
6. Tools & Build Systems: IDEs (Eclipse, IntelliJ IDEA); build tools (Maven or Gradle); version control (Git/GitHub or GitLab); application servers (Tomcat).
Equal employment opportunity information KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you. QUALIFICATIONS B-Tech
Posted 1 day ago
3.0 - 5.0 years
4 - 6 Lacs
india
On-site
Job Title: Full Stack DOTNET Developer (Females Only) Experience: 3 to 5 Years Location: Hyderabad/Punjagutta (Only candidates from Telangana apply) Industry: Real Estate / PropTech Department: IT & Software Development Gender: Females Job Type: Full-time Working Mode: On-site About Us: We are a fast-growing real estate company leveraging technology to streamline property transactions, enhance customer experience, and enable data-driven decision-making. We are looking for a proactive and technically sound Dotnet Full Stack Developer to strengthen our in-house tech capabilities and work on custom-built ERP, CRM, and customer portals tailored for real estate workflows. Key Responsibilities: * Develop scalable backend logic using ASP.NET Core / ASP.NET MVC / Web API and integrate with frontend frameworks like Angular or React * Build dynamic front-end interfaces that support smooth user interactions for both internal teams and customers. * Create RESTful APIs to facilitate integrations with marketing platforms, payment gateways, and third-party tools. * Ensure data integrity and security of sensitive customer and property data. * Optimize SQL Server queries for performance across high-traffic real estate portals and dashboards. * Collaborate with cross-functional teams including sales, marketing, CRM, and finance to deliver user-friendly solutions. * Participate in code reviews, testing, deployment, and production support. 
Tech Stack Requirements: * Strong experience with ASP.NET Core, C#, Entity Framework * Strong SQL knowledge for SQL Server (stored procedures, joins, triggers, indexing) * Proficiency in Angular or React * Strong knowledge of HTML5, CSS3, JavaScript, Bootstrap * Familiarity with REST APIs, JSON, and AJAX * Version control systems like Git * Understanding of Agile methodology Qualifications: * B.Tech or equivalent degree in Computer Science or a related field. If you are interested, please get in touch with HR to schedule your interview at 9133367000 (Mon - Sat, 10am to 5pm only). Job Types: Full-time, Permanent Pay: ₹400,000.00 - ₹600,000.00 per year Benefits: Cell phone reimbursement Provident Fund Work Location: In person
Posted 1 day ago
1.0 years
0 Lacs
hyderābād
On-site
DESCRIPTION The candidate would be responsible for maintaining/refreshing WBRs and other analytical frameworks set up by senior analysts. They would also be required to build simple reports, take up dive-deep requests, make changes to existing analytical frameworks and provide ad hoc data support to Ops stakeholders. The person should have a good understanding of a business requirement and the ability to quickly get to the root cause of a particular reporting/BI/data issue, and draft solutions for resolution. The ideal candidate would be high on attention to detail, bias for action and interest in analytics/BI/automation. Some of the key result areas include, but are not limited to: Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions. Ensure data accuracy by validating data for new and existing resources. Work closely with stakeholders (internal/external) to understand and automate/enhance existing processes. Should be open to learning and developing skill sets in the latest technologies and analytical techniques. Should understand how data/analytical frameworks and their work translate to business on the ground. Should be able to come up with innovative ideas for new work or to improve existing work.

BASIC QUALIFICATIONS 1+ years of data analytics or automation experience. Bachelor's degree. Knowledge of data pipelining and extraction using SQL. Knowledge of SQL and Excel at a moderate or advanced level. Knowledge of SQL/Python/R, scripting, MS Excel, table joins, and aggregate analytical functions. Expertise with visualization tools such as QuickSight, Tableau or Power BI.

PREFERRED QUALIFICATIONS Experience in Linux and AWS Services. Our inclusive culture empowers Amazonians to deliver the best results for our customers.
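The "table joins and aggregate analytical functions" named in the qualifications can be illustrated with a small, self-contained SQL example (SQLite via Python's standard library; the tables and values are invented):

```python
import sqlite3

# Two toy tables: orders joined to a regions lookup, then aggregated.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders(id INTEGER, region TEXT, amount REAL);
INSERT INTO orders VALUES (1,'North',100),(2,'North',50),(3,'South',70);
CREATE TABLE regions(region TEXT, manager TEXT);
INSERT INTO regions VALUES ('North','Asha'),('South','Ravi');
""")
rows = con.execute("""
    SELECT r.manager, COUNT(*) AS n, SUM(o.amount) AS total
    FROM orders o JOIN regions r ON o.region = r.region
    GROUP BY r.manager ORDER BY total DESC
""").fetchall()
# rows -> [('Asha', 2, 150.0), ('Ravi', 1, 70.0)]
```

The same join-then-aggregate shape underlies most WBR-style reporting queries, whatever the engine.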
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 1 day ago
0 years
0 Lacs
andhra pradesh
On-site
Primary Skills: Experience in the risk or finance technology area. Strong SQL knowledge, involving complex joins and analytical functions. Good understanding of data flow, data models and database applications. Working knowledge of databases like Oracle and Netezza. Decompose existing Essbase cubes (understand cube structure and logic, document all metadata and business rules, and prepare for migration or rebuild on modern platforms).

Secondary Skills: Conceptual knowledge of ETL and data warehousing; working knowledge is an added advantage. Basic knowledge of Java is an added advantage.

JD: Seeking professionals with the capability to perform thorough analysis and articulation of risk or Finance Tech model data requirements, and identification and understanding of specific data quality issues, to ensure effective delivery of data to the users using standard tools. The candidate will provide analysis of internal and external regulations (credit risk) and prepare functional documentation on this basis, and will work with large amounts of data: facts, figures, and number crunching.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
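A small sketch of the "complex joins and analytical functions" this posting emphasizes, using a window function (LAG) over an invented month-end exposures table. SQLite 3.25+ is assumed for window-function support; in Oracle or Netezza the same query shape applies.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE exposures(account TEXT, as_of TEXT, balance REAL);
INSERT INTO exposures VALUES
 ('A','2024-01-31',100),('A','2024-02-29',120),
 ('B','2024-01-31',300),('B','2024-02-29',280);
""")
# Month-over-month balance change per account via LAG().
rows = con.execute("""
    SELECT account, as_of, balance,
           balance - LAG(balance) OVER (
               PARTITION BY account ORDER BY as_of) AS delta
    FROM exposures ORDER BY account, as_of
""").fetchall()
```

The first row of each partition has no prior balance, so its delta is NULL; subsequent rows carry the period-over-period movement that risk reporting typically needs.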
Posted 1 day ago
0 years
0 Lacs
bengaluru, karnataka, india
On-site
Company Description: Borderless enables retail investors in India, the Middle East, and South-East Asia to globalize their savings and wealth by investing in mature international markets like the US. By using Borderless, users can digitally open, own, and operate overseas investing accounts. Our platform is backed by big data-based automated research and analysis, helping investors make smarter decisions. Borderless also assists in managing portfolios more efficiently and discovering interesting investment opportunities daily. We are funded by seasoned investors with substantial experience in the global financial services industry.

Role Description: This is a full-time on-site role for a QA Engineer located in Bengaluru. We are seeking a highly skilled and detail-oriented QA Engineer with strong manual testing expertise and solid database knowledge. The ideal candidate will be passionate about ensuring product quality across web and API platforms, capable of understanding business requirements end-to-end, and eager to collaborate with cross-functional teams. A basic understanding of automation frameworks is desirable to help scale testing.

Responsibilities:
Test Planning & Execution: Analyze requirements, create comprehensive test plans, test cases, and test data. Perform functional, integration, regression, system, smoke, and UAT testing. Ensure complete test coverage across web, API, and database layers.
Defect Management: Identify, document, and track defects through to resolution using standard tools (e.g., JIRA). Provide clear, concise defect reports with steps to reproduce and impact analysis.
Database Testing: Write and execute complex SQL queries to validate data integrity, relationships, and backend processes. Perform CRUD validations, stored procedure testing, and data consistency checks. Verify data flows and perform validations in MongoDB.
Automation Support (Basic): Understand and contribute to automation frameworks in Java, Selenium, and Rest Assured at a basic level.
Collaborate with automation engineers to identify test scenarios for automation. Work closely with developers, product managers, and business analysts to clarify requirements and reproduce issues. Participate in daily stand-ups, sprint planning, and retrospective meetings in an Agile/Scrum environment.

Skills & Qualifications: Strong understanding of the complete SDLC and STLC. Expertise in test design techniques (boundary value analysis, equivalence partitioning, decision tables, etc.). Hands-on experience in web, mobile, and API testing. Proficient with defect tracking and test management tools.
Database Knowledge: SQL skills (joins, subqueries, aggregates). Working knowledge of MongoDB (CRUD operations, aggregation basics).
Automation Basics: Good understanding of Java programming. Familiarity with automation frameworks like Selenium WebDriver and Rest Assured for API testing. (ref:hirist.tech)
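The database-testing duties above (validating data integrity and relationships with SQL) can be sketched as an orphaned-record check: rows in a child table whose foreign key has no parent. SQLite stands in for the application database, and the table names are hypothetical.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE accounts(id INTEGER PRIMARY KEY);
INSERT INTO accounts VALUES (1),(2);
CREATE TABLE trades(id INTEGER, account_id INTEGER);
INSERT INTO trades VALUES (10,1),(11,2),(12,99);
""")
# LEFT JOIN + IS NULL finds child rows with no matching parent.
orphans = con.execute("""
    SELECT t.id FROM trades t
    LEFT JOIN accounts a ON t.account_id = a.id
    WHERE a.id IS NULL
""").fetchall()
# one orphan: trade 12 references the non-existent account 99
```

A QA suite would assert that such queries return zero rows after every load; any hit is a defect report with the offending keys attached.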
Posted 2 days ago
5.0 - 8.0 years
0 Lacs
pune, maharashtra, india
Remote
Experience Required: 5 - 8 Years. Location: Remote (India-based candidates preferred, working EST hours). Availability: Immediate to 15 Days.

About The Role: We are seeking a Data Issue Fix Engineer / Data Remediation Engineer to join our team. In this role, you will be responsible for identifying, analyzing, and fixing complex data issues across multiple systems and databases. You will work closely with cross-functional teams to ensure data accuracy, reliability, and reproducibility, particularly in the context of clinical trial and multi-tenant platforms.

Key Responsibilities: Troubleshoot and resolve complex data issues across multiple related tables. Write and optimize advanced SQL queries, including joins, CTEs, window functions, temp tables, and nested queries. Implement schema-level fixes, rollback logic, and work with migration tools such as Phinx or Flyway. Validate data using test cases to ensure consistency, accuracy, and reproducibility. Work on data remediation within clinical trial systems (CTMS, eSource, or similar platforms). Collaborate using Jira for ticket lifecycle management, tagging, and workflow tracking. Document root cause analysis clearly and communicate findings with stakeholders.

Required Skills & Qualifications: Strong hands-on expertise in advanced SQL. Proven experience in debugging data issues across complex relational database systems. Experience with schema-level fixes, rollback logic, and migration tooling (Phinx, Flyway, Liquibase, etc.). Knowledge of data validation techniques and writing reproducible test cases. Familiarity with clinical trial systems, CTMS, eSource, or multi-tenant platforms (preferred). Experience with Jira and understanding of ticket lifecycle management. Excellent communication skills with the ability to document and explain root causes.

Company Profile: Dynamisch is an Information Technology Solutions and Services Company focused on providing end-to-end outsourced product engineering services.
Our areas of expertise include software product development, mobile application development, software migration and re-engineering, cloud enablement, and QA & testing services. Dynamisch offers outsourced product development services to small and medium-sized technology companies and startups in areas like enterprise, cloud, web, social networking, media, and mobile applications. We combine our business domain experience, technology expertise, knowledge of the latest industry trends and quality-driven delivery model to cater to cross-vertical clients. We are a well-staffed, professional, and quality service provider, offering our technical expertise to augment your in-house technology capabilities. Our expert team of dedicated and diligent consultants, developers, designers and statisticians do more than simply use technology to create applications. We take time to truly understand our clients' business: its ethos and objectives, which enables us to comprehend precise requirements and create solutions as per their vision. (ref:hirist.tech)
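A hedged sketch of the remediation work this posting describes: advanced SQL with a CTE and a window function to find and delete duplicate rows while keeping the most recent copy. The visits table is invented, and SQLite 3.25+ is assumed for ROW_NUMBER support; a production fix would also wrap this in a transaction with rollback logic.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE visits(subject TEXT, visit INTEGER, recorded_at TEXT);
INSERT INTO visits VALUES
 ('S1',1,'2024-01-01'),('S1',1,'2024-01-03'),  -- duplicate visit row
 ('S1',2,'2024-02-01'),('S2',1,'2024-01-05');
""")
# Rank duplicates within each (subject, visit) group, newest first;
# anything with rn > 1 is a stale duplicate to remove.
dupes = con.execute("""
    WITH ranked AS (
        SELECT rowid AS rid,
               ROW_NUMBER() OVER (PARTITION BY subject, visit
                                  ORDER BY recorded_at DESC) AS rn
        FROM visits)
    SELECT rid FROM ranked WHERE rn > 1
""").fetchall()
con.executemany("DELETE FROM visits WHERE rowid = ?", dupes)
```

Selecting the victim rows first, then deleting by key, keeps the fix reviewable and reproducible, which is what the validation-test-case requirement above is getting at.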
Posted 2 days ago
5.0 - 31.0 years
3 - 4 Lacs
hyderabad
On-site
Job description: About Company: Sree Enterprises is a leading 3PL firm in Hyderabad providing services like C&F and warehousing.

Job Description: The Warehouse Manager will be employed by Sree Enterprises and will report to the management. The warehouse manager will be responsible for the entire warehouse operations of one client account, working with the logistics team, salesforce team and other key executives from the client side. He will manage a team of dispatch supervisors, billing operators, inward/outward clerks, etc., and report everyday work to the management.

Responsibilities: The manager is responsible for the total inventory in the warehouse. The manager has to follow the Standard Operating Procedure (SOP). Work with the client's sales team and logistics team and solve problems, if any. Work with the company's accounts team in preparing bills, etc. Work with supervisors and dispatch goods on time without delay. Arrange transportation by working with transporters.

Requirements: Good communication skills – English, Telugu (Hindi not compulsory but desired). Email communication – should be able to read and draft emails to clients. Computer skills – Excel. SAP knowledge preferred but not compulsory; the manager should learn SAP once he joins (SAP training will be given). Should learn the work required to carry out day-to-day tasks quickly. Most importantly, the manager should be responsible and sincere in executing the work.

Job Type: Full-time Benefits: Health insurance Provident Fund Work Location: In person
Posted 2 days ago
0 years
0 Lacs
hyderabad, telangana, india
On-site
Company Description Blend360 is a data and AI services company specializing in data engineering, data science, MLOps, and governance to build scalable analytics solutions. It partners with enterprise and Fortune 1000 clients across industries including financial services, healthcare, retail, technology, and hospitality to drive data-driven decision making. Headquartered in Columbia, Maryland, the company is recognized for rapid growth and global delivery of AI solutions through the integration of people, data, and technology. We are seeking a hands-on Data Engineer with deep expertise in distributed systems, ETL/ELT development, and enterprise-grade database management. The engineer will design, implement, and optimize ingestion, transformation, and storage workflows to support the MMO platform. The role requires technical fluency across big data frameworks (HDFS, Hive, PySpark), orchestration platforms (NiFi), and relational systems (Postgres), combined with strong coding skills in Python and SQL for automation, custom transformations, and operational reliability. Job Description We are implementing a Media Mix Optimization (MMO) platform designed to analyze and optimize marketing investments across multiple channels. This initiative requires a robust on-premises data infrastructure to support distributed computing, large-scale data ingestion, and advanced analytics. The Data Engineer will be responsible for building and maintaining resilient pipelines and data systems that feed into MMO models, ensuring data quality, governance, and availability for Data Science and BI teams. The environment integrates HDFS for distributed storage, Apache NiFi for orchestration, Hive and PySpark for distributed processing, and Postgres for structured data management. 
This role is central to enabling seamless integration of massive datasets from disparate sources (media, campaign, transaction, customer interaction, etc.), standardizing data, and providing reliable foundations for advanced econometric modeling and insights.

Responsibilities:
Data Pipeline Development & Orchestration: Design, build, and optimize scalable data pipelines in Apache NiFi to automate ingestion, cleansing, and enrichment from structured, semi-structured, and unstructured sources. Ensure pipelines meet low-latency and high-throughput requirements for distributed processing.
Data Storage & Processing: Architect and manage datasets on HDFS to support high-volume, fault-tolerant storage. Develop distributed processing workflows in PySpark and Hive to handle large-scale transformations, aggregations, and joins across petabyte-level datasets. Implement partitioning, bucketing, and indexing strategies to optimize query performance.
Database Engineering & Management: Maintain and tune Postgres databases for high availability, integrity, and performance. Write advanced SQL queries for ETL, analysis, and integration with downstream BI/analytics systems.
Collaboration & Integration: Partner with Data Scientists to deliver clean, reliable datasets for model training and MMO analysis. Work with BI engineers to ensure data pipelines align with reporting and visualization requirements.
Monitoring & Reliability Engineering: Implement monitoring, logging, and alerting frameworks to track data pipeline health. Troubleshoot and resolve issues in ingestion, transformations, and distributed jobs.
Data Governance & Compliance: Enforce standards for data quality, lineage, and security across systems. Ensure compliance with internal governance and external regulations.
Documentation & Knowledge Transfer: Develop and maintain comprehensive technical documentation for pipelines, data models, and workflows. Provide knowledge sharing and onboarding support for cross-functional teams.
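The monitoring-and-alerting responsibility above can be sketched as a simple row-count drift check on each pipeline run; the threshold, names, and logging setup are illustrative assumptions, not an actual Blend360 framework.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline.health")

def check_pipeline_run(expected_rows, loaded_rows, tolerance=0.01):
    """Return 'ok' or 'alert' based on row-count drift between what a
    pipeline run was expected to load and what actually landed."""
    drift = abs(expected_rows - loaded_rows) / max(expected_rows, 1)
    if drift > tolerance:
        log.error("row-count drift %.1f%% exceeds tolerance %.1f%%",
                  drift * 100, tolerance * 100)
        return "alert"
    log.info("row counts within tolerance (drift %.3f)", drift)
    return "ok"
```

Hooked to a scheduler, the error-level record is what an alerting framework (email, PagerDuty, etc.) would key on; the same shape extends to null-rate and freshness checks.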
Qualifications: Bachelor’s degree in Computer Science, Information Technology, or a related field (Master’s preferred). Proven experience as a Data Engineer with expertise in HDFS, Apache NiFi, Hive, PySpark, Postgres, Python, and SQL. Strong background in ETL/ELT design, distributed processing, and relational database management. Experience with on-premises big data ecosystems supporting distributed computing. Solid debugging, optimization, and performance tuning skills. Ability to work in agile environments, collaborating with multi-disciplinary teams. Strong communication skills for cross-functional technical discussions.

Preferred Qualifications: Familiarity with data governance frameworks, lineage tracking, and data cataloging tools. Knowledge of security standards, encryption, and access control in on-premises environments. Prior experience with Media Mix Modeling (MMM/MMO) or marketing analytics projects. Exposure to workflow schedulers (Airflow, Oozie, or similar). Proficiency in developing automation scripts and frameworks in Python for CI/CD of data pipelines.
Posted 2 days ago
3.0 years
0 Lacs
noida, uttar pradesh, india
On-site
Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Specialist, Application Support (Splunk Dev). Work Location: Noida, Bangalore or Chennai. Experience: 3-6 Years. Mandatory Skills: File Monitoring Setup Using Splunk.

What a Successful File Monitoring Setup Using Splunk Involves:
Dashboard Development & Management: Design and maintain advanced Splunk dashboards to deliver comprehensive insights into system performance and File Transmission component health.
Performance Optimization: Improve dashboard efficiency when handling large datasets using techniques such as optimized queries, summary indexing, and data models.
Advanced Regex Utilization: Apply sophisticated regular expressions to create accurate search queries and extract meaningful data.
Custom Alert Configuration: Implement highly customized alerting mechanisms to detect anomalies, manage alert actions, throttle conditions, and integrate with lookup tables and dynamic time-based arguments.
File Transmission Monitoring: Track and report on each stage of file transmission, continuously refining monitoring strategies for enhanced reliability and visibility.
Cross-Functional Collaboration: Work closely with various teams to integrate Splunk monitoring with broader IT systems and workflows. Conduct discovery of file transmission workflows, including file life cycle, endpoint configurations, log analysis, SLA definitions, and exception scenarios.
Develop and deploy advanced Splunk queries to ensure end-to-end visibility into file transmission processes. Configure and optimize alerting mechanisms for timely detection and resolution of issues. Design and implement IT Service Intelligence (ITSI) strategies to enhance monitoring capabilities and deliver actionable insights. Establish and manage monitoring frameworks based on the file life cycle to ensure traceability and accountability. Collaborate with IT and operations teams to integrate Splunk with other tools and resolve data ingestion issues. Analyze monitoring data to identify trends, detect anomalies, and recommend improvements. Serve as a Splunk subject matter expert, providing guidance, best practices, and training to team members. What You Will Need To Have Education: Bachelor’s and/or Master’s degree in Information Technology, Computer Science, or a related field. Experience: Minimum of 3+ years in IT, with a focus on Splunk, SFTP tools, data integration, or technical support roles. Splunk Expertise: Proficiency in advanced SPL techniques including subsearches, joins, and statistical functions. Regex Proficiency: Strong command of regular expressions for search and data extraction. Database Skills: Experience with relational databases and writing complex SQL queries with advanced joins. File Transmission Tools: Hands-on experience with platforms like Sterling File Gateway, IBM Sterling, or other MFT solutions. Analytical Thinking: Proven problem-solving skills and the ability to troubleshoot technical issues effectively. Communication: Strong verbal and written communication skills for collaboration with internal and external stakeholders. Attention to Detail: High level of accuracy to ensure data integrity and reliability. What Would Be Great To Have Scripting & Automation: Proficiency in Python or similar scripting languages to automate monitoring tasks. 
Tool Experience: Familiarity with tools such as Dynatrace, Sterling File Gateway, and other MFT solutions. Linux Proficiency: Strong working knowledge of Linux and command-line operations. Secure File Transfer Protocols: Hands-on experience with SFTP and tools like SFG, NDM, and MFT using SSH encryption. Task Scheduling Tools: Experience with job scheduling platforms such as AutoSys, Control-M, or cron.

Thank you for considering employment with Fiserv. Please apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our Commitment To Diversity And Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
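The regex-driven field extraction this posting describes can be illustrated in Python with named capture groups, the same mechanism Splunk's rex command uses. The log line and its format are hypothetical, not an actual Fiserv or Splunk schema.

```python
import re

# Hypothetical file-transfer log line, invented for illustration.
LINE = ("2024-05-01T10:15:02Z host=mft01 "
        "file=ACH_20240501.txt status=FAILED bytes=10485760")

# Named groups pull out the fields a monitoring search would alert on.
PATTERN = re.compile(
    r"file=(?P<file>\S+)\s+status=(?P<status>\w+)\s+bytes=(?P<bytes>\d+)")

m = PATTERN.search(LINE)
fields = {k: m.group(k) for k in ("file", "status", "bytes")}
# fields["status"] -> "FAILED"
```

In Splunk the equivalent would be an inline `rex field=_raw "...(?P<status>\w+)..."` feeding an alert on status=FAILED; the regex skills transfer directly.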
Posted 2 days ago
10.0 years
0 Lacs
hyderabad, telangana, india
On-site
We are hiring for Business Analyst
Experience: 6–10 Years
Work Location: Hyderabad, Chennai, or Bangalore
Notice: 0–20 Days
Key Skills: Business Analyst, Banking domain, Finance, SQL
Role Overview
We are seeking a Risk & Finance Technology Specialist with strong expertise in SQL, database applications, and financial data models. The ideal candidate will play a key role in managing data flow, enhancing analytical systems, and supporting the migration of Essbase cubes to modern platforms.
Key Responsibilities
Design, analyze, and optimize data flow and data models for finance and risk technology applications.
Write and optimize complex SQL queries, including advanced joins and analytical functions.
Work with databases such as Oracle and Netezza to develop, maintain, and troubleshoot applications.
Decompose existing Essbase cubes: analyze cube structures and logic, document metadata, hierarchies, and business rules, and prepare solutions for migration or rebuild on modern platforms.
Collaborate with business and technology teams to ensure data integrity, accuracy, and alignment with business requirements.
Qualifications
Proven experience in risk or finance technology domains.
Strong proficiency in SQL and database concepts.
Working knowledge of Oracle, Netezza, or similar databases.
Solid understanding of data flows, data modeling, and database applications.
Hands-on experience with Essbase cubes (design, structure, business rules, and migration).
Strong problem-solving skills and attention to detail.
If interested, apply at shalini.v@saranshinc.com
#RiskTechnology #FinanceTechnology #SQL #DataModeling #DataEngineering #OracleDatabase #Netezza #Essbase #DataAnalytics
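The "complex SQL queries, including advanced joins and analytical functions" requirement can be illustrated with a short sketch. SQLite stands in for Oracle/Netezza here, and the desk/position schema is invented:

```python
import sqlite3

# Hypothetical desk/position schema; SQLite stands in for Oracle or Netezza.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts  (acct_id INTEGER, desk TEXT);
CREATE TABLE positions (acct_id INTEGER, exposure REAL);
INSERT INTO accounts  VALUES (1,'rates'),(2,'rates'),(3,'credit');
INSERT INTO positions VALUES (1,100.0),(2,250.0),(3,75.0),(1,50.0);
""")

# Join the two tables, aggregate per account, then use a window
# (analytical) function to add the desk-level total to each row.
rows = conn.execute("""
SELECT desk, acct_id, acct_exposure,
       SUM(acct_exposure) OVER (PARTITION BY desk) AS desk_exposure
FROM (
    SELECT a.desk, p.acct_id, SUM(p.exposure) AS acct_exposure
    FROM positions p
    JOIN accounts a ON a.acct_id = p.acct_id
    GROUP BY a.desk, p.acct_id
)
ORDER BY desk, acct_id
""").fetchall()

for r in rows:
    print(r)
```

In Oracle the same `SUM(...) OVER (PARTITION BY ...)` syntax applies; only the connection layer changes.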
Posted 2 days ago
4.0 years
0 Lacs
hyderābād
On-site
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.
This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans.
We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.
Job Title: Senior Software Engineer – ETL Tester
Position: Senior Software Engineer
Experience: 4–6 Years
Category: Software Testing
Main location: Hyderabad/Bangalore
Employment Type: Full Time
Your future duties and responsibilities
Big Data/ETL/DWH testing experience with RDBMS and/or Hadoop with reporting.
Strong SQL skills to validate complex transformations, including but not limited to joins, various functions, and CTEs.
Understanding of Python/PySpark data testing approaches and frameworks, with 1+ years of hands-on experience.
Execute test solutions for Table and File ingestion ETL processes. Exposure to real-time streaming applications is a plus.
Candidates should be well versed in the testing process, defect management, and other aspects of the Quality Engineering life cycle.
Defect lifecycle management; develop and maintain test artifacts throughout the testing life cycle.
JIRA/XRAY or similar Agile project and test management tool experience.
Required qualifications to be successful in this role
Qualification: Bachelor’s degree in computer science or a related field, or higher, with a minimum of 3 years of relevant experience.
Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
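The SQL-based validation of "complex transformations" described in this role can be sketched as a CTE that computes the expected target values and joins against the loaded table to surface violations. SQLite stands in for the warehouse; the tables and the `amount * 1.1` rule are invented for illustration:

```python
import sqlite3

# Hypothetical source/target tables and a made-up transformation rule
# (target amount_usd should be amount * 1.1); SQLite stands in for the DWH.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, amount REAL);
CREATE TABLE tgt (id INTEGER, amount_usd REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
-- row 3 is loaded incorrectly on purpose
INSERT INTO tgt VALUES (1, 11.0), (2, 22.0), (3, 99.0);
""")

# A CTE computes the expected target values; the join surfaces any row
# where the loaded value violates the transformation rule.
mismatches = conn.execute("""
WITH expected AS (
    SELECT id, ROUND(amount * 1.1, 2) AS amount_usd FROM src
)
SELECT e.id
FROM expected e
JOIN tgt t ON t.id = e.id
WHERE ROUND(t.amount_usd, 2) <> e.amount_usd
""").fetchall()

print(mismatches)
```

Each mismatch id would then be logged as a defect with the expected and actual values attached, feeding the defect lifecycle described above.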
Posted 2 days ago
1.0 years
4 - 9 Lacs
hyderābād
On-site
DESCRIPTION
The candidate would be responsible for maintaining/refreshing WBRs and other analytical frameworks set up by senior analysts. They would also be required to build simple reports, take up dive-deep requests, make changes to existing analytical frameworks, and provide ad hoc data support to Ops stakeholders. The person should have a good understanding of a business requirement and the ability to quickly get to the root cause of a particular reporting/BI/data issue and draft solutions for resolution. The ideal candidate would be high on attention to detail, bias for action, and interest in analytics/BI/automation.
Key result areas include, but are not limited to:
Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
Ensure data accuracy by validating data for new and existing resources.
Work closely with stakeholders (internal/external) to understand and automate/enhance existing processes.
Be open to learning and developing skill sets in the latest technologies and analytical techniques.
Understand how data/analytical frameworks and their work translate to the business on the ground.
Come up with innovative ideas for new work or to improve existing work.
BASIC QUALIFICATIONS
1+ years of data analytics or automation experience
Bachelor's degree
Knowledge of data pipelining and extraction using SQL
Knowledge of SQL and Excel at a moderate or advanced level
Knowledge of SQL/Python/R, scripting, MS Excel, table joins, and aggregate analytical functions
Expertise with visualization tools such as QuickSight, Tableau, or Power BI
PREFERRED QUALIFICATIONS
Experience in Linux and AWS services
Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
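The "ensure data accuracy by validating data" duty above often starts with a duplicate-key check before refreshing a metric or WBR: a GROUP BY with HAVING flags keys that appear more than once. The orders schema below is invented, with SQLite as the database:

```python
import sqlite3

# Hypothetical orders table; order_id 102 is duplicated on purpose.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, site TEXT, units INTEGER);
INSERT INTO orders VALUES
    (101,'HYD',5),(102,'HYD',3),(102,'HYD',3),(103,'BLR',7);
""")

# GROUP BY with HAVING flags keys that appear more than once --
# a quick accuracy check before a metric or dashboard refresh.
dupes = conn.execute("""
SELECT order_id, COUNT(*) AS n
FROM orders
GROUP BY order_id
HAVING COUNT(*) > 1
""").fetchall()

print(dupes)
```

An empty result means the refresh can proceed; any row means upstream ingestion double-loaded and the metric would overcount.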
Posted 2 days ago
3.0 years
0 Lacs
hyderābād
On-site
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.
This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans.
We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.
Job Title: SE / Quality Engineer – Junior – Data Testing
Position: Software Engineer
Experience: 3+ Years
Category: Testing / Quality Assurance
Main Location: Hyderabad
Position ID: J0725-0099
Employment Type: Full Time
We are looking for an experienced Data Testing Engineer to join our team. The ideal candidate should be passionate about coding and developing scalable and high-performance applications. You will work closely with our front-end developers, designers, and other members of the team to deliver quality solutions that meet the needs of our clients.
Qualification: Bachelor's degree in Computer Science or a related field, or higher, with a minimum of 3 years of relevant experience.
Your future duties and responsibilities
Responsibilities:
Execute Big Data, ETL, and DWH testing with RDBMS and/or Hadoop reporting systems.
Write and validate SQL queries for complex transformations (joins, functions, CTEs).
Apply Python/PySpark data testing approaches and frameworks.
Execute test solutions for Table and File ingestion ETL processes.
Provide support for real-time streaming application testing (a plus).
Perform defect lifecycle management and track issues accurately.
Develop and maintain test artifacts throughout the testing lifecycle.
Collaborate with Agile teams using JIRA/XRAY or similar project/test management tools.
Must-Have Skills:
3+ years of experience in Big Data / ETL / DWH testing with RDBMS and/or Hadoop reporting.
Strong SQL skills to validate complex transformations (joins, functions, CTEs).
1+ years of hands-on experience with Python/PySpark data testing frameworks.
Experience with defect management and the testing lifecycle.
Good exposure to Agile testing processes.
Proficiency with JIRA/XRAY or similar Agile project/test management tools.
Required qualifications to be successful in this role
Good-to-Have Skills:
Exposure to real-time streaming applications.
Experience with automation in data testing.
Knowledge of cloud-based data platforms (AWS, Azure, GCP).
Strong problem-solving and analytical skills.
Ability to work effectively in cross-functional Agile teams.
CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodations for people with disabilities in accordance with provincial legislation. Please let us know if you require a reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs.
Together, as owners, let’s turn meaningful insights into action.
Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
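The null and uniqueness checks at the heart of the data testing responsibilities above can be sketched in plain Python; in PySpark the same assertions would run over DataFrame columns. The sample rows are invented:

```python
# Plain-Python stand-ins for two common data-quality tests; a PySpark
# framework would express the same assertions over DataFrame columns.

def check_not_null(rows, column):
    """Return indexes of rows where `column` is missing or None."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Invented sample data: one null amount, one duplicated id.
rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 2, "amount": 5.0},
]
print(check_not_null(rows, "amount"))
print(check_unique(rows, "id"))
```

Each non-empty result would become a tracked defect, which is how checks like these plug into the defect lifecycle and test-artifact duties listed above.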
Posted 2 days ago
0 years
0 Lacs
andhra pradesh
On-site
Primary Skills
Experience in the risk or finance technology area
Strong SQL knowledge, involving complex joins and analytical functions
Good understanding of data flow, data models, and database applications
Working knowledge of databases such as Oracle and Netezza
Decompose existing Essbase cubes: understand cube structure and logic, document all metadata and business rules, and prepare for migration or rebuild on modern platforms
Secondary Skills
Conceptual knowledge of ETL and data warehousing; working knowledge is an added advantage
Basic knowledge of Java is an added advantage
JD
Seeking professionals capable of thorough analysis and articulation of risk or finance tech model data requirements, and of identifying and understanding specific data quality issues to ensure effective delivery of data to users using standard tools. The role provides analysis of internal and external regulations (credit risk) and preparation of functional documentation on that basis, and involves working with large amounts of data: facts, figures, and number crunching.
About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 2 days ago
The joins job market in India is thriving with numerous opportunities for skilled professionals. Joins roles are in high demand across various industries, making it an attractive field for job seekers looking to build a career in data analysis and database management.
These major cities in India are actively hiring for joins roles, offering a wide range of opportunities for job seekers.
The average salary range for joins professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
A typical career progression in the joins field may involve starting as a Junior Database Analyst, advancing to a Database Developer, then moving up to a Senior Database Administrator, and finally reaching the position of a Database Architect or Data Engineer.
In addition to expertise in joins, professionals in this field are often expected to have skills in SQL query optimization, database design, data modeling, and ETL processes.
As you prepare for joins roles in India, remember to showcase your expertise in SQL and database management, along with related skills such as data modeling and query optimization. By honing your skills and confidently answering interview questions, you can position yourself as a strong candidate in the competitive job market. Good luck with your job search!