Hacks, Habits and Helpful Hints: The Salesforce Admin's Reference Guide. This short guide explains how to use the Salesforce Data Loader from the command line. No more clicks, no more errors.
Salesforce Admin's guide: the Data Loader from the command line
1. Using Data Loader from the command line - Salesforce
Using the Salesforce Data Loader through its GUI is convenient but laborious...
Convenient, because it lets you retrieve any Salesforce object: contacts, customers, prospects, current opportunities, active users...
But the handling is slow, because you have to repeat the same actions in the GUI over and over: copy/paste your ID and password, scan the existing objects, select one, validate, select the fields to export, possibly build filters, validate, revalidate, and finally produce the CSV file in the directory you specified. The steps are more or less the same for other operations such as insert, update, and upsert, with the extra choice of a mapping file. And this sometimes several times a day.
There is so little value in this for users that they will look kindly on automating it all. Good news: the Salesforce Data Loader can also be used in text mode (-:
You just need to know whether you connect to the internet through a proxy or directly.
2. Start by creating a new directory on your PC
Create an SF subdirectory under D:\Data (to be changed depending on your PC configuration and your needs).
Go to C:\Program Files (x86)\salesforce.com\Data Loader (possibly adjusted according to the installation directory of your Data Loader).
Copy the bin, Java, and samples directories and dataloader-XX.0.0-uber.jar to D:\Data\SF. These directories are used regardless of the type of operation performed: insert, export...
Create the directories D:\Data\SF\export, D:\Data\SF\import, D:\Data\SF\update, and D:\Data\SF\upsert.
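Under the assumptions above (default installation path, D:\Data\SF layout), the setup can be sketched as a small batch script; the paths and the XX version number are examples to adapt to your own machine:

```bat
@echo off
rem Create the working tree (adjust D:\Data\SF to your own needs)
mkdir "D:\Data\SF\export" "D:\Data\SF\import" "D:\Data\SF\update" "D:\Data\SF\upsert"

rem Copy the Data Loader runtime (the version number XX will differ on your PC)
set DL=C:\Program Files (x86)\salesforce.com\Data Loader
xcopy /E /I "%DL%\bin" "D:\Data\SF\bin"
xcopy /E /I "%DL%\Java" "D:\Data\SF\Java"
xcopy /E /I "%DL%\samples" "D:\Data\SF\samples"
copy "%DL%\dataloader-XX.0.0-uber.jar" "D:\Data\SF\"
```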
3. Create three empty files:
key.txt, config.properties, and process-conf.xml (confirm the .properties and .xml extensions despite the Windows warning messages). Place them in D:\Data\SF\export.
This directory and these files are used for a single operation: exporting Salesforce data.
The principle is to keep the file containing the query to export (written in SOQL, Salesforce's query language) separate from the file containing the passwords, for maintenance reasons: when you change your password, there is only one place to update it!
4. Configure the files:
We first set up config.properties:
#Loader Config
#03/04/2015
#The next two lines are comments (debug options):
#sfdc.debugMessages=true
#sfdc.debugMessagesFile=D:\Donnees\SF\insert\1AccountExport.log
#Path to your key.txt file:
process.encryptionKeyFile=D:\Donnees\SF\insert\key.txt
sfdc.endpoint=https://login.salesforce.com
#Your Salesforce login email, without quotes:
sfdc.username=your_salesforce_email
#Your encrypted Salesforce password; we will come back to this:
sfdc.password=TO-FILL-IN
#Your proxy ID:
sfdc.proxyUsername=ID_proxy
#Your encrypted proxy password; we will come back to this:
sfdc.proxyPassword=TO-FILL-IN
#Proxy parameters; see your internet browser settings:
sfdc.proxyHost=
sfdc.proxyPort=
sfdc.loadBatchSize=100
sfdc.timeoutSecs=600
Note that Java .properties files treat the backslash as an escape character, so the paths may need doubled backslashes (D:\\Donnees\\SF\\insert\\key.txt) or forward slashes.
This is where things get a bit complicated. Passwords must be encrypted. For this, Salesforce provides an encryption program (you did copy the bin directory, didn't you?).
You must start the Windows console (under the Start menu, in "Search programs and files", type cmd).
Step 1: Move to the bin directory with the command cd "D:\Data\SF\bin" (remember DOS commands?). Type the following command:
encrypt.bat -g TheTextYouWantImagineAnythingYouWant
Figure 2: CMD window in Windows 7
Copy and paste the result of the command into the key.txt file; no need to put quotes around it. Save the changes. Pay attention to whitespace: no spaces in key.txt. Close the file. End of step 1.
Step 2: Still in the console, still under D:\Data\SF\bin, run the following command:
encrypt.bat -e YourProxyPassword "D:\Data\SF\export\key.txt"
Step 3: Encrypt your Salesforce password and your security token, concatenated without spaces. Still in the console, still under D:\Data\SF\bin, run:
encrypt.bat -e YourSalesforcePasswordWithTheToken "D:\Data\SF\export\key.txt"
Set aside the encrypted passwords generated.
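Put end to end, the three encryption steps form the console session below. The seed text and passwords are placeholders; the -g and -e flags are those of the encrypt.bat utility shipped with the Data Loader — run encrypt.bat without arguments to check the exact usage of your version:

```bat
cd /d "D:\Data\SF\bin"

rem Step 1: generate an encryption key from any seed text; paste the output into key.txt
encrypt.bat -g TheTextYouWantImagineAnythingYouWant

rem Step 2: encrypt the proxy password with that key
encrypt.bat -e YourProxyPassword "D:\Data\SF\export\key.txt"

rem Step 3: encrypt your Salesforce password concatenated with your security token
encrypt.bat -e YourSalesforcePasswordWithTheToken "D:\Data\SF\export\key.txt"
```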
Step 4: Open the config.properties file. The goal is to complete all the parameters with the encrypted passwords obtained:
1. Your ID as recognized by the proxy (sfdc.proxyUsername)
2. Your proxy password, now encrypted (sfdc.proxyPassword)
3. Your Salesforce login (sfdc.username)
4. Your Salesforce password + token, concatenated and now encrypted (sfdc.password)
Save the changes.
Work on the config.properties file is finished.
We configure process-conf.xml by copying the following into process-conf.xml:
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
<!-- ============================ FIRST EXTRACT ============================ -->
<!-- TODO: add the date to the output file name and set the backup directories -->
<!-- Drop the CSV exports in the usual directories (D:\X_RepDepot\Files_Tmp\MAJ Salesforce_Extract Cptes Distrib_Users), then create the .bat files and the installation doc -->
<bean id="ExtractUser"
class="com.salesforce.dataloader.process.ProcessRunner"
singleton="false">
<description>ExtractUser job gets User info from salesforce and saves info into a CSV file.</description>
<property name="name" value="ExtractUser"/>
<property name="configOverrideMap">
<map>
<!-- Note: as of 12/02/2015, accented characters in comments do not survive the export and cause errors.
Command line: process.bat "D:\Donnees\SF\test" ExtractUser -->
<!-- Log files -->
<entry key="sfdc.debugMessages" value="true"/>
<entry key="sfdc.debugMessagesFile" value="D:\Donnees\SF\test\logExtractUser.log"/>
<!-- Salesforce extraction parameters, same as in the Data Loader config -->
<entry key="sfdc.timeoutSecs" value="600"/>
<entry key="sfdc.loadBatchSize" value="200"/>
<entry key="sfdc.extractionRequestSize" value="500"/>
<entry key="sfdc.entity" value="User"/>
<!-- SOQL query -->
<entry key="sfdc.extractionSOQL" value="Select Id, LastName, FirstName, Username, CommunityNickname, CompanyName, Division, Department, Title, City, Email, IsActive, UserRoleId, ProfileId, UserType, DelegatedApproverId, ManagerId, LastLoginDate, CreatedDate, LastModifiedDate, DesactivOrderDat__c, CreatedById, LastModifiedById, IsPortalEnabled, PerId__c, Agence__c, EmployeeNum__c, BU__c, Pointdevente__c, Brand__c, Username__c, LDAPLogin__c, RegDCDiv__c, SousRegion__c, Marche__c, CompanyName__c, PerformId__c, resp_workflow_dae__c, MultiPDV__c, CodeEnseigne__c FROM User"/>
<!-- Operation name; for us, extract. All the other operations are possible provided you have an .sdl mapping file and configure it -->
<entry key="process.operation" value="extract"/>
<!-- Export settings: output file, and logs in case of error -->
<entry key="dataAccess.type" value="csvWrite"/>
<entry key="dataAccess.readUTF8" value="true"/>
<entry key="dataAccess.writeUTF8" value="true"/>
<entry key="dataAccess.name" value="D:\Donnees\_Extract Cptes Distrib_Users\User_Salesforce.csv"/>
<entry key="process.outputError" value="D:\Donnees\SF\test\logErrorExtractUser.csv"/>
</map>
</property>
</bean>
</beans>
Some explanations. The most important part of the file is the following entry:
<entry key="sfdc.extractionSOQL" value="Select Id, LastName, FirstName, Username, CommunityNickname, CompanyName, Division, Department, Title, City, Email, IsActive, UserRoleId, ProfileId, UserType, DelegatedApproverId, ManagerId, LastLoginDate, CreatedDate, LastModifiedDate, DesactivOrderDat__c, CreatedById, LastModifiedById, IsPortalEnabled, PerId__c, Agence__c, EmployeeNum__c, BU__c, Pointdevente__c, Brand__c, Username__c, LDAPLogin__c, RegDCDiv__c, SousRegion__c, Marche__c, CompanyName__c, PerformId__c, resp_workflow_dae__c, MultiPDV__c, CodeEnseigne__c FROM User"/>
This part is the SOQL query. It can be adapted to your needs, for example:
<entry key="sfdc.extractionSOQL" value="SELECT Id, Username, Email FROM User WHERE IsActive = true"/>
Note that SOQL, unlike SQL, does not support SELECT *: the fields must always be listed explicitly. Just copy and paste your usual query, exactly as it appears in your Data Loader window, between the quote characters.
There is a second important element: the bean id ("ExtractUser"), which is the process name you will pass on the command line. We will come back to it.
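Because the process name is simply the bean id, several jobs can live in the same process-conf.xml. Here is a sketch with a second, purely hypothetical bean, ExtractAccount, which is not part of this guide's files:

```xml
<beans>
  <bean id="ExtractUser" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
    <!-- name and configOverrideMap properties as shown above -->
  </bean>
  <!-- A second, hypothetical job in the same file -->
  <bean id="ExtractAccount" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
    <description>ExtractAccount job gets Account info and saves it into a CSV file.</description>
    <property name="name" value="ExtractAccount"/>
    <property name="configOverrideMap">
      <map>
        <entry key="sfdc.entity" value="Account"/>
        <entry key="sfdc.extractionSOQL" value="SELECT Id, Name FROM Account"/>
        <entry key="process.operation" value="extract"/>
        <entry key="dataAccess.type" value="csvWrite"/>
        <entry key="dataAccess.name" value="D:\Data\SF\export\Account_Salesforce.csv"/>
      </map>
    </property>
  </bean>
</beans>
```

You would then launch the second job by passing the bean id ExtractAccount instead of ExtractUser on the command line.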
5. Creating the execution file
We are reaching the goal. Go to the desktop and create the file export.bat, then copy the following lines into it:
echo off
cd "D:\Data\SF\bin"
process.bat "D:\Data\SF\export" ExtractUser
Does this tell you something?
6. Operation:
Double-click the export.bat file. After some time, you will have the query result in a CSV file located under D:\Data\_Extract CPTES Distrib_Users.
Then simply set up a scheduled task to run the file automatically, for example when your PC starts, without you having to do anything.
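For the scheduled task, the built-in Windows schtasks command is enough. A sketch — the task name, the path to export.bat, and the time are placeholders to adapt (see schtasks /? on your system):

```bat
rem Run the export every day at 06:00 (adjust the path to wherever you saved export.bat)
schtasks /create /tn "SalesforceExtractUser" /tr "C:\Users\you\Desktop\export.bat" /sc daily /st 06:00
```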
7. Future
It is possible to go further with the .xml file: insert, update, and upsert are possible, in fact any operation you can do with the Data Loader. We will come back to them if you suggest it to us.
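As an illustration of those other operations, an insert job could look like the following sketch. The bean, the accountMap.sdl mapping file, and the CSV file are hypothetical examples; the differences from the extract are process.operation, dataAccess.type set to csvRead, and the process.mappingFile entry pointing to the .sdl mapping file:

```xml
<bean id="InsertAccount" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
  <description>InsertAccount job reads a CSV file and inserts Accounts into Salesforce.</description>
  <property name="name" value="InsertAccount"/>
  <property name="configOverrideMap">
    <map>
      <entry key="sfdc.entity" value="Account"/>
      <entry key="process.operation" value="insert"/>
      <!-- the .sdl file maps the CSV columns to Salesforce fields -->
      <entry key="process.mappingFile" value="D:\Data\SF\import\accountMap.sdl"/>
      <entry key="dataAccess.type" value="csvRead"/>
      <entry key="dataAccess.name" value="D:\Data\SF\import\new_accounts.csv"/>
      <entry key="process.outputSuccess" value="D:\Data\SF\import\success.csv"/>
      <entry key="process.outputError" value="D:\Data\SF\import\error.csv"/>
    </map>
  </property>
</bean>
```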
Thanks!