Sourav Banerjee is a software developer with over 4 years of experience in banking, big data, and mainframe development. He has extensive skills in Java, COBOL, SQL, Hadoop, Hive, and Pig. His career includes projects in data migration, log analysis, and statement generation, delivered for clients such as ING Bank during his tenure at Tata Consultancy Services. Sourav holds certifications in financial markets, big data analytics, and data science.
Sourav Banerjee Resume
SOURAV BANERJEE
CONTACT DETAILS:
Email: souravbanerjee423@gmail.com
Mob: 9482400751 / 9619860924
PERSONAL DATA:
Date of Birth: 14/03/1991
Gender: Male
Nationality: Indian
Languages: English, Hindi, Bengali
A dynamic, result-oriented Software Developer (Senior Software Engineer) with a strong background in the banking/credit-card domain on Hadoop/Spark technology and the HP Tandem/COBOL environment. Proven track record of 4+ years across diverse facets of software development, design, data analytics, and execution of business applications, with additional knowledge of data mining, machine learning, and data science.
PROFESSIONAL CAREER DETAILS:
Domain Knowledge: Wholesale banking / credit-card domain.
Functional Areas: Effective communicator with the ability to convey ideas in speaking and writing; excellent analytical and decision-making skills. Good knowledge of wholesale banking and card (switch) concepts.
TECHNICAL SKILL SET:
Languages: Java (J2EE), Python (basic), R, SQL, COBOL, C
Utilities: SPOOLCOM, BATCHCOM, GBS (ATLAS), PERUSE, FUP, ENFORM, IDCAMS, DFSORT, FTP, INSPECT, TMF, HUE, Cloudera Manager
Tools: STORQM, FRACT, RDDM, MS Excel, UltraEdit, Expeditor, Streaming Platform, VMware
Hadoop Ecosystem: MapReduce, Sqoop, Hive, Pig, HBase, HDFS, ZooKeeper
IDEs: Eclipse, Microsoft SQL Server 2010, Atom
Relational Databases: Enscribe file system (SQL/MP), DB2, HDFS
NoSQL Database: HBase
Framework: Hadoop framework
Application Servers: HP Tandem, IBM Mainframe z/OS
Operating Systems: Windows XP/7, UNIX/Linux, Guardian (NonStop Kernel), z/OS
Scripting Languages: Shell script, Python (basic)
Project Management Tools: HPSM, JIRA, SNOW
IT Operation Standard: Scrum and Agile methodology
EDUCATIONAL BACKGROUND:
Bachelor of Technology
Year of Passing: June 2012
CAREER SUMMARY:
Proactive, flexible, customer-focused, and innovative, with good analytical skills; able to work under pressure and tight deadlines.
3 years of extensive experience in COBOL, TACL, TAL, UNIX, IBM mainframe systems, and Tandem development.
1+ years of experience as a Hadoop developer with sound knowledge of the big data technology stack: Hadoop, Hive, HDFS, MapReduce, Sqoop, Pig, Flume, Impala.
Strong knowledge of Hadoop, Hive, and Hive's analytical functions.
Hands-on experience working with ecosystem components such as Hive, Pig, Sqoop, MapReduce, Flume, and Oozie.
Implemented proofs of concept on the Hadoop stack and various big data analytics tools, including migration from different databases (e.g., Teradata, Oracle, MySQL) to Hadoop.
Successfully loaded files into Hive and HDFS from HBase.
Efficient in building Hive, Pig, and MapReduce scripts.
Loaded datasets into Hive for ETL operations.
Experience using DbVisualizer, ZooKeeper, and Cloudera Manager.
Experience in database design using stored procedures, functions, and triggers, and strong experience writing complex queries for DB2 and SQL Server.
Banking knowledge, particularly in the areas of payments, interest accruals, FX, and MM.
Credit-card domain knowledge, especially issuing and acquiring.
Experience in testing (unit testing, functional analysis testing, and user acceptance testing).
Attention to detail combined with the ability to see the bigger picture of a problem.
Good experience working in Agile (Scrum) and Waterfall methodologies.
CERTIFICATIONS:
Certification in Basics of Financial Markets – NCFM
Big Data Analytics with HDInsight: Hadoop on Azure – Microsoft
IBM Certified Big Data and Hadoop Developer – IBM
Certified Big Data and Hadoop Developer – Simplilearn
Certified Data Science – R Programming – Simplilearn
Winner of the TCS GEMS On-the-Spot Award
Refer to the LinkedIn profile below for more details.
PROFESSIONAL EXPERIENCE:
Company: Mashreq Global Services (Nov 2016 – present)
Technical Role: Hadoop Developer
Working as a development and production-support executive in the card domain, covering the issuing and acquiring modules. Installed raw Hadoop and NoSQL applications and developed programs for analyzing data.
Responsibilities:
• Replaced the default Derby metadata store for Hive with MySQL.
• Executed queries using Hive and developed MapReduce jobs to analyze data.
• Developed Pig Latin scripts to extract data from web-server output files and load it into HDFS.
• Developed Pig UDFs to preprocess the data for analysis.
• Developed Hive queries for the analysts.
• Used Impala in place of Hive for some short-running applications.
• Used Flume for streaming data.
• Involved in loading data from Linux and UNIX file systems into HDFS.
• Analyzed the web log data using HiveQL (see the sketch below).
Technology Used: Core Java, Apache Hadoop, HDFS, Pig, Hive, Cassandra, shell scripting, MySQL, Linux, UNIX
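
Purely as an illustrative sketch of the kind of HiveQL web-log analysis described above; the table name, column names, and HDFS path are hypothetical, not taken from the project:

    -- Hypothetical external table over raw web-server logs already landed in HDFS
    CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
      ip      STRING,
      log_ts  STRING,
      request STRING,
      status  INT,
      bytes   BIGINT
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/weblogs/raw';  -- hypothetical landing path

    -- Example ad hoc query: busiest resources and their server-error counts
    SELECT request,
           COUNT(*) AS hits,
           SUM(CASE WHEN status >= 500 THEN 1 ELSE 0 END) AS server_errors
    FROM web_logs
    GROUP BY request
    ORDER BY hits DESC
    LIMIT 20;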
Tata Consultancy Services Ltd (Oct 2012 – Aug 2016)
Worked as a Systems Engineer in the banking and financial services practice. During my tenure with TCS, I was involved in analyzing business requirements, technical solution design, developing COBOL code, quality assurance, and internal reviews.
Client: ING Wholesale Banking
Duration: 3 years 4 months (May 2013 – Aug 2016)
Role: Systems Engineer
Key Projects Undertaken:
ATLAS Data Analysis:
Technical Role: Hadoop Developer
Duration: 6 months
Project Details: ING Bank of the Netherlands is one of the leading banks in the world. ING migrated from Atlas to GBS in 2008; however, regulatory requirements make it mandatory to keep data for 10 years, and ING keeps it for 20. With the Atlas servers decommissioned, maintaining servers just to retain the data was very costly. It was therefore decided to migrate all the data from the Atlas servers to Hadoop. As the data is highly structured, it was stored in Hive tables so that further analysis could be performed and regulatory and customer requirements could be met.
Responsibilities:
• Developed Hive scripts for end-user and analyst requirements to perform ad hoc analysis.
• Solved performance issues in Hive through an understanding of joins, grouping, and aggregation, and how they translate into MapReduce jobs.
• Created external and managed Hive tables with optimized partitioning and bucketing; data is partitioned by year, sub-branch, currency, and account type (see the sketch below).
• Generated statements for an account for a particular year.
Technology Used: COBOL, UNIX, Java, Sqoop, Hive
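
A minimal HiveQL sketch of the layout described above, assuming hypothetical table and column names; only the partition keys (year, sub-branch, currency, account type) and the per-account, per-year statement query follow the project description:

    -- Hypothetical external table for the migrated Atlas data, partitioned
    -- as described above and bucketed by account number for faster lookups
    CREATE EXTERNAL TABLE IF NOT EXISTS atlas_entries (
      account_no STRING,
      booking_dt STRING,
      narrative  STRING,
      amount     DECIMAL(18,2)
    )
    PARTITIONED BY (yr INT, sub_branch STRING, currency STRING, acct_type STRING)
    CLUSTERED BY (account_no) INTO 32 BUCKETS
    STORED AS ORC
    LOCATION '/data/atlas/entries';  -- hypothetical path

    -- Statement rows for one account for a particular year; the filter on the
    -- yr partition column lets Hive prune every other year's data
    SELECT booking_dt, narrative, amount
    FROM atlas_entries
    WHERE yr = 2007                 -- hypothetical year
      AND account_no = 'ACC-00042'  -- hypothetical account
    ORDER BY booking_dt;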
ING Security Monitoring:
Technical Role: Hadoop Developer
Duration: 2 months
Project Details: Daily activities in the GBS environments are recorded in log files, which are stored as flat files. At the end of the day, after the pre-EOD run, the log files are FTPed to the local system and then stored in HDFS, where they are analyzed using MapReduce to check for any suspicious activity.
Responsibilities:
• Worked as lead developer for the payment module and the foreign exchange module.
• Developed the problem definition design for enabling project requirements.
• Developed a MapReduce program to analyze the log files and check for any suspicious activity.
• Performed team activities such as internal code reviews and quality assurance.
Technology Used: MQ, MapReduce, Hadoop framework, Tandem, UNIX
Statement Generation:
Technical Role: Hadoop Developer
Duration: 2 months
Project Details: As an extension of the daily log analysis for security monitoring, Hadoop can be used to analyze the accounting-entry records and generate statements for customers on a daily basis. Statement generation makes up a large portion of the daily EOD processing.
Responsibilities:
• Developed a MapReduce program to read the accounting-entries file, generate the output, and send it to the PSE (Paper Statement Engine) for further processing (see the illustrative sketch below).
• Worked on the whole design and development of the project.
• Developed new COBOL code for new tags in SWIFT messaging.
• Designed the application flow and user interaction.
• Provided development solutions to meet end-user requirements.
• Delivered code to defined standards within the given timeframe.
Technology Used: UNIX, MapReduce, COBOL, Hive, Sqoop, HDFS
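
The statement job itself was a MapReduce program, whose code is not part of this resume; purely as an illustration of the grouping it performs, here is a HiveQL analogue over a hypothetical accounting-entries table:

    -- Roll up one business day's accounting entries per account, in the shape
    -- a downstream formatter such as the PSE could consume (names hypothetical)
    SELECT account_no,
           COUNT(*)    AS entry_count,
           SUM(amount) AS net_movement
    FROM accounting_entries
    WHERE booking_dt = '2015-06-01'  -- hypothetical EOD business date
    GROUP BY account_no;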
ING Wholesale Bank: GBS LCM Release 2014
Technical Role: Tandem Developer
Duration: 8 months
Project Details: A combination of different small change requests (CRs), each based on a specific customer requirement, such as report creation, payment-flow changes, and database handling.
Technology Used: IBM Mainframe, Tandem, CICS, UNIX, Java
Responsibilities:
• Worked as lead developer.
• Developed code from scratch and made small changes to existing programs.
• Developed technical solution designs for enabling customer requirements.
• Monitored day-to-day activities.
• Performed team activities such as internal code reviews, time estimates, and quality assurance.
Technology Used: COBOL, MVS/JCL, TAL, UNIX, Tandem
ING Wholesale Bank: SWIFT 2013
Technical Role: Tandem Developer
Duration: 4 months
Project Details: Based on updating SWIFT messaging in line with the new SWIFT 2013 manual; new tags had to be added to offer new facilities to customers.
Technology Used: COBOL, MVS/JCL, TAL, UNIX, Tandem
Responsibilities:
• Worked on the whole design and development of the project.
• Designed the application flow and user interaction.
• Provided development solutions to meet end-user requirements.
• Delivered code to defined standards within the given timeframe.
ING Wholesale Bank: GMAINT 2013
Technical Role: Tandem Developer
Duration: 4 months
Project Details: In this project, WSS (Wall Street Systems) delivered new components to be accommodated on the Tandem server. We gathered the requirements specific to ING and modified and updated the WSS deliveries accordingly.
Technology Used: Tandem, COBOL, JCL, DB2
Responsibilities:
• Worked as a developer.
• Involved in technical analysis.
• Worked on section preparation and delivery.
• Gathered requirements from the onsite functional team.
• Designed the application flow and user interaction.
• Worked on documentation.
ING Wholesale Bank: GBS LCM Release 2013
Technical Role: Tandem Developer
Duration: 8 months
Project Details: A combination of different small change requests (CRs), each based on a specific customer requirement, such as report creation, payment-flow changes, and database handling.
Technology Used: IBM Mainframe, Tandem, CICS, UNIX, Java
Responsibilities:
• Worked as a developer.
• Developed code from scratch and made small changes to existing programs.
• Developed technical solution designs for enabling customer requirements.
• Monitored day-to-day activities.
Client: TCS Internal
Duration: 4 months (Jan 2013 – April 2013)
Role: Software Developer
Non-Life Insurance First Quote Generation (Jan 2013 – Apr 2013)
Technical Role: Tandem Developer
Duration: 4 months
Project Details: The non-life-insurance fast-quote generation system needs to estimate the premium amount for an applicant based on given factors, including type of insurance, coverage amount, length of coverage, age, gender, driving history, health and medical history, family history, vehicle history, and approximate rating class. The calculated premium is displayed on screen, and the quote is saved in the system database on completion of the transaction.
Technology Used: COBOL, JCL, DB2, IBM Mainframe, UNIX
Responsibilities:
• Developed functional solution designs as per business requirements.
• Developed COBOL code and JCL jobs from scratch.
• Created maps using CICS for the UI.
• Developed technical solution designs for enabling customer requirements.
• Performed team activities such as time estimates and quality assurance of documents.
REFERENCES
References available upon request.
LinkedIn Profile: https://www.linkedin.com/in/sourav-banerjee-50b443106/
DECLARATION
I hereby declare that the above information is correct to the best of my knowledge, and I bear responsibility for the correctness of the particulars mentioned above.