Shaheen Akhtar has 3 years of experience as a Programmer Analyst at Cognizant Technology Solutions, working on projects for clients such as Novartis and Sandoz. She has expertise in requirement gathering, data integration, Oracle, PL/SQL, SQL tuning and quality assurance. Her responsibilities include creating views, stored procedures and APIs, validating data integrity, designing technical specifications and performing testing. She is proficient with technologies such as TIBCO Data Virtualization, Oracle and SQL, and tools such as SQL Developer, Squirrel and Postman.
SHAHEEN AKHTAR
Shaheenakhtar54321@gmail.com | (+91) 8908316720, 7978249698 | Mahadevapura, Bangalore, KA-560048
EDUCATION
C.V. Raman College of Engineering, 2013-2017
Biju Patnaik University of Technology, Bhubaneswar, India
Bachelor of Technology in Information Technology
PROFESSIONAL SUMMARY
3 years of professional work experience in development, testing and implementation of software applications.
Currently working with Cognizant Technology Solutions since April 2018. Areas of expertise include requirement
gathering, solution architecture for data integration, development and enhancement of data integration, Oracle,
PL/SQL, SQL tuning, quality and production support, and team management.
Hands-on experience with Oracle PL/SQL, SQL, Squirrel, TDV and data standardization, as well as API layers using
REST/SOAP to deliver standardized data.
Knowledgeable data analyst skilled in data collection, analysis and management. Works well under pressure
and meets deadlines and targets while delivering high-quality work.
A keen analyst and team player with a thorough understanding of all aspects of the SDLC, from understanding
client requirements through direct client interaction, translating them into technical specifications and driving
their execution.
Excellent analytical, logical, programming and debugging skills.
EXPERIENCE
Programmer Analyst, Cognizant Technology Solutions, Bangalore, India (April 2018 - Present)
Client: Novartis, Life Science Domain
Technology: TIBCO Data Virtualization, Oracle, SQL, PL/SQL, Squirrel, Postman
Role: Data Engineer
Project Title: TechOps Quality
The project covered several integrations, including APQR, QANTO and QADM (Quality Analytics and Data
Management). Worked on data management for the client, modifying data according to client requirements
without changing the source data, creating virtual views using TDV and publishing them via OData and
REST APIs (GET and POST) without affecting the main data. Data Virtualization (DV12N) integrates data
from multiple data sources and provides a single access point to data consumers.
Technology Used: TIBCO Data Virtualization, Oracle PL/SQL, SQL, Squirrel, Postman
Responsibilities:
Create views, PL/SQL packages and functions for application enhancements.
Created data services like OData, ODBC and JDBC, and web services like REST APIs (GET and POST).
Development of tables, indexes, views, triggers and other objects using TDV.
Created definition sets, procedures and XQuery transformations according to client requirements.
Published a single procedure as a GET API to call multiple procedures using a common input column with
different values.
Applied multi-table caching in the views for daily refresh of data.
Create DB objects like tables, indexes and synonyms.
Analyze, conceptualize and create end-to-end technical design and mapping documents.
Design specifications; perform unit testing, IQS and UAT.
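The "single procedure published as a GET API" item above follows a dispatcher pattern: one entry point receives a common input column and routes to a different underlying procedure depending on its value. A minimal Python sketch of that idea, not the actual TDV implementation; the procedure names and the routing values are hypothetical placeholders:

```python
def fetch_apqr(record_id):
    # Stand-in for a backend procedure serving APQR data (hypothetical).
    return {"integration": "APQR", "id": record_id}

def fetch_qadm(record_id):
    # Stand-in for a backend procedure serving QADM data (hypothetical).
    return {"integration": "QADM", "id": record_id}

# The common input column's value selects which underlying procedure runs.
_DISPATCH = {
    "APQR": fetch_apqr,
    "QADM": fetch_qadm,
}

def get_quality_data(source_system, record_id):
    """Single published entry point: route on the common input value."""
    try:
        handler = _DISPATCH[source_system]
    except KeyError:
        raise ValueError(f"unknown source system: {source_system}")
    return handler(record_id)
```

Keeping the routing table as data (rather than a chain of conditionals) makes it easy to add a new backend procedure without touching the published entry point.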
Project Title: Novartis Axway Interface
Axway API Gateway manages, delivers and secures enterprise APIs, applications and consumers. Data
Virtualization (DV12N) integrates data from multiple data sources and provides a single access point to the data
consumers.
Technology Used: TDV, Oracle SQL Developer, Postman, Squirrel
Responsibilities:
Analyzing the existing system and the impact of using TIBCO Data Virtualization (TDV, also known as
Composite).
Design and creation of the object structure in TDV (i.e., the Composite tool).
Develop virtual databases, establishing data source connections.
Create views on top of database tables and publish them as OData resources, ODBC and JDBC as well.
Create base views and shared views, and publish them as data services and web services.
Understood the client requirements and built business logic according to the client's specifications.
Used the API Gateway's help and provided secured URLs to the client.
Used basic authentication for delivering secured data to the client.
Design specifications; perform unit testing, IQS and user acceptance testing (UAT).
Project Title: Sandoz GP
The Government Pricing (GP) module is part of the Revitas Flex Suite. It houses all required domain and
transactional data for GP calculation management, including the configuration of calculations, periodic
price generation, reporting and auditability.
Data from various systems is fed to the GP data repositories, and data integrity is validated. Pricing analysts
set up and maintain price type policies, calculations and reporting, and have the ability to submit pricings
from the GP Flex module.
Technology Used: TIBCO Data Virtualization, Oracle PL/SQL, SQL, Postman
Responsibilities:
Create views, PL/SQL packages, procedures and functions for application enhancements.
Development of triggers and views using SQL Developer.
Create DB objects like tables, indexes, synonyms, etc.
Created parameterized queries and packaged queries according to the client's requirements and worked
on the business logic to give the required result.
Design specifications; perform unit testing, IQS and UAT.
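The parameterized-query work above has the same shape in any SQL client: the client's value is bound as a parameter rather than concatenated into the statement, which keeps execution plans reusable and prevents SQL injection. A minimal sketch using Python's stdlib sqlite3 in place of Oracle; the price_policy table and its columns are hypothetical stand-ins for the GP repository:

```python
import sqlite3

# In-memory database standing in for the GP data repository.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE price_policy (price_type TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO price_policy VALUES (?, ?)",
    [("AMP", 10.5), ("BP", 8.0), ("AMP", 11.0)],
)

def prices_for_type(price_type):
    """Parameterized query: the value is bound via a placeholder, never
    interpolated into the SQL text."""
    cur = conn.execute(
        "SELECT amount FROM price_policy WHERE price_type = ? ORDER BY amount",
        (price_type,),
    )
    return [row[0] for row in cur]
```

In Oracle PL/SQL the same idea appears as bind variables (e.g. `:price_type`) in the query text.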
SKILLS
Databases: Oracle 9i, Oracle 10g, Oracle 11g, Oracle 12c
Languages: SQL, PL/SQL
Applications: MS Word, MS Excel, MS PowerPoint
Methodologies: Agile, Waterfall, SDLC
Technologies: TIBCO Data Virtualization, Oracle PL/SQL
Tools: TIBCO Studio, PL/SQL Developer, SQL Developer, Squirrel, Postman