InterCon 2016 - Performance, anti-patterns e stacks para desenvolvimento ágil

PREMATURE OPTIMIZATION
The Root of ALL Evil
(CODE VERSION)
@akitaonrails
manga-downloadr \
  -u http://www.mangareader.net/onepunch-man \
  -d /tmp/manga/one-punch-man \
  -n one-punch-man
#!/usr/bin/env ruby
$LOAD_PATH.unshift File.join(File.dirname(__FILE__), '..', 'lib')
require 'optparse'

options = { test: false }
option_parser = OptionParser.new do |opts|
  opts.banner = "Usage: manga-downloadr [options]"
  opts.on("-t", "--test", "Test routine") do |t|
    options[:url]       = "http://www.mangareader.net/onepunch-man"
    options[:name]      = "one-punch-man"
    options[:directory] = "/tmp/manga-downloadr/one-punch-man"
    options[:test]      = true
  end
  opts.on("-u URL", "--url URL",
    "Full MangaReader.net manga homepage URL - required") do |v|
    options[:url] = v
  end
  opts.on("-n NAME", "--name NAME",
    "slug to be used for the sub-folder to store all manga files - required") do |n|
    options[:name] = n
  end
  opts.on("-d DIRECTORY", "--directory DIRECTORY",
    "main folder where all mangas will be stored - required") do |d|
    options[:directory] = d
  end
  opts.on("-h", "--help", "Show this message") do
    puts opts
    exit
  end
end
require 'manga-downloadr'

generator = MangaDownloadr::Workflow.create(options[:url], options[:name], options[:directory])
generator.fetch_chapter_urls!
generator.fetch_page_urls!
generator.fetch_image_urls!
generator.fetch_images!
generator.compile_ebooks!
require 'manga-downloadr'

generator = MangaDownloadr::Workflow.create(options[:url], options[:name], options[:directory])
puts "Massive parallel scanning of all chapters"
generator.fetch_chapter_urls!
puts "\nMassive parallel scanning of all pages"
generator.fetch_page_urls!
puts "\nMassive parallel scanning of all images"
generator.fetch_image_urls!
puts "\nTotal page links found: #{generator.chapter_pages_count}"
puts "\nMassive parallel download of all page images"
generator.fetch_images!
puts "\nCompiling all images into PDF volumes"
generator.compile_ebooks!
puts "\nProcess finished."
require 'manga-downloadr'

generator = MangaDownloadr::Workflow.create(options[:url], options[:name], options[:directory])
unless generator.state?(:chapter_urls)
  puts "Massive parallel scanning of all chapters"
  generator.fetch_chapter_urls!
end
unless generator.state?(:page_urls)
  puts "\nMassive parallel scanning of all pages"
  generator.fetch_page_urls!
end
unless generator.state?(:image_urls)
  puts "\nMassive parallel scanning of all images"
  generator.fetch_image_urls!
  puts "\nTotal page links found: #{generator.chapter_pages_count}"
end
unless generator.state?(:images)
  puts "\nMassive parallel download of all page images"
  generator.fetch_images!
end
unless options[:test]
  puts "\nCompiling all images into PDF volumes"
  generator.compile_ebooks!
end
puts "\nProcess finished."
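The `state?` / `current_state` pair is what lets an interrupted run resume from the last completed step. One minimal way to implement that checkpointing is with marker files under the manga directory — a sketch only, the real gem may persist its state differently:

```ruby
require 'fileutils'
require 'tmpdir'

# Hypothetical checkpoint helper: a step is "done" when its marker file exists.
class Checkpoints
  def initialize(root)
    @dir = File.join(root, '.states')
    FileUtils.mkdir_p(@dir)
  end

  # has this step already completed in a previous run?
  def state?(state)
    File.exist?(File.join(@dir, state.to_s))
  end

  # record that this step completed
  def current_state(state)
    FileUtils.touch(File.join(@dir, state.to_s))
  end
end

root = Dir.mktmpdir
cp = Checkpoints.new(root)
cp.state?(:chapter_urls)       # => false on a fresh run
cp.current_state(:chapter_urls)
cp.state?(:chapter_urls)       # => true, so a re-run skips this step
```

The `unless generator.state?(...)` guards above then make the whole pipeline idempotent: re-running the command only redoes the steps that never finished.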
MangaDownloadr::Workflow

module MangaDownloadr
  ImageData = Struct.new(:folder, :filename, :url)

  class Workflow
    def initialize(root_url = nil, manga_name = nil, manga_root = nil, options = {})
    end

    def fetch_chapter_urls!
    end

    def fetch_page_urls!
    end

    def fetch_image_urls!
    end

    def fetch_images!
    end

    def compile_ebooks!
    end

    def state?(state)
    end

    private

    def current_state(state)
    end
  end
end
fetch_chapter_urls!

def fetch_chapter_urls!
  doc = Nokogiri::HTML(open(manga_root_url))
  self.chapter_list = doc.css("#listing a").map { |l| l['href'] }
  self.manga_title  = doc.css("#mangaproperties h1").first.text
  current_state :chapter_urls
end
def fetch_page_urls!
  chapter_list.each do |chapter_link|
    response = Typhoeus.get "http://www.mangareader.net#{chapter_link}"
    chapter_doc = Nokogiri::HTML(response.body)
    pages = chapter_doc.xpath("//div[@id='selectpage']//select[@id='pageMenu']//option")
    chapter_pages.merge!(chapter_link => pages.map { |p| p['value'] })
    print '.'
  end
  self.chapter_pages_count = chapter_pages.values.inject(0) { |total, list| total += list.size }
  current_state :page_urls
end
def fetch_page_urls!
  chapter_list.each do |chapter_link|
    begin
      response = Typhoeus.get "http://www.mangareader.net#{chapter_link}"
      begin
        chapter_doc = Nokogiri::HTML(response.body)
        pages = chapter_doc.xpath("//div[@id='selectpage']//select[@id='pageMenu']//option")
        chapter_pages.merge!(chapter_link => pages.map { |p| p['value'] })
        print '.'
      rescue => e
        self.fetch_page_urls_errors << { url: chapter_link, error: e, body: response.body }
        print 'x'
      end
    rescue => e
      puts e
    end
  end
  unless fetch_page_urls_errors.empty?
    puts "\nErrors fetching page urls:"
    puts fetch_page_urls_errors
  end
  self.chapter_pages_count = chapter_pages.values.inject(0) { |total, list| total += list.size }
  current_state :page_urls
end
def fetch_page_urls!
  hydra = Typhoeus::Hydra.new(max_concurrency: hydra_concurrency)
  chapter_list.each do |chapter_link|
    begin
      request = Typhoeus::Request.new "http://www.mangareader.net#{chapter_link}"
      request.on_complete do |response|
        begin
          chapter_doc = Nokogiri::HTML(response.body)
          pages = chapter_doc.xpath("//div[@id='selectpage']//select[@id='pageMenu']//option")
          chapter_pages.merge!(chapter_link => pages.map { |p| p['value'] })
          print '.'
        rescue => e
          self.fetch_page_urls_errors << { url: chapter_link, error: e, body: response.body }
          print 'x'
        end
      end
      hydra.queue request
    rescue => e
      puts e
    end
  end
  hydra.run
  unless fetch_page_urls_errors.empty?
    puts "\nErrors fetching page urls:"
    puts fetch_page_urls_errors
  end
  self.chapter_pages_count = chapter_pages.values.inject(0) { |total, list| total += list.size }
  current_state :page_urls
end
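The Hydra shape above — queue everything up front, then run with a concurrency cap — is the whole trick behind the speedup. The same fan-out idea can be sketched with plain Ruby threads and a Queue (a toy analogy, not Typhoeus itself; `run_pool` and `worker_count` are illustrative names):

```ruby
require 'thread'

# Toy fan-out: a fixed pool of workers drains a queue of jobs,
# mirroring Hydra's "queue requests, then run with max_concurrency".
def run_pool(jobs, worker_count: 4)
  queue = Queue.new
  jobs.each { |j| queue << j }
  results = Queue.new

  workers = worker_count.times.map do
    Thread.new do
      until queue.empty?
        job = queue.pop(true) rescue break  # non-blocking pop; stop when drained
        results << job.call
      end
    end
  end
  workers.each(&:join)
  Array.new(results.size) { results.pop }
end

squares = run_pool((1..10).map { |n| -> { n * n } })
squares.sort  # => [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
```

The key property is that total wall time is dominated by the slowest batch, not the sum of all requests — which is why the scraper's runtime drops so sharply compared with the sequential loop.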
def fetch_image_urls!
  hydra = Typhoeus::Hydra.new(max_concurrency: hydra_concurrency)
  chapter_list.each do |chapter_key|
    chapter_pages[chapter_key].each do |page_link|
      begin
        request = Typhoeus::Request.new "http://www.mangareader.net#{page_link}"
        request.on_complete do |response|
          begin
            chapter_doc = Nokogiri::HTML(response.body)
            image       = chapter_doc.css('#img').first
            tokens      = image['alt'].match(/^(.*?)\s-\s(.*?)$/)
            extension   = File.extname(URI.parse(image['src']).path)
            chapter_images.merge!(chapter_key => []) if chapter_images[chapter_key].nil?
            chapter_images[chapter_key] << ImageData.new(tokens[1], "#{tokens[2]}#{extension}", image['src'])
            print '.'
          rescue => e
            self.fetch_image_urls_errors << { url: page_link, error: e }
            print 'x'
          end
        end
        hydra.queue request
      rescue => e
        puts e
      end
    end
  end
  hydra.run
  unless fetch_image_urls_errors.empty?
    puts "\nErrors fetching image urls:"
    puts fetch_image_urls_errors
  end
  current_state :image_urls
end
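The image's `alt` attribute is split into a folder token and a filename token by that `/^(.*?)\s-\s(.*?)$/` match. A quick illustration (the sample alt text below is hypothetical — actual MangaReader alt strings may differ):

```ruby
# Split "<title> - <page label>" into folder and filename parts,
# the way fetch_image_urls! does with the image's alt attribute.
alt = "Onepunch-Man 1 - Page 3"
tokens = alt.match(/^(.*?)\s-\s(.*?)$/)
tokens[1]  # => "Onepunch-Man 1"  (used as the folder name)
tokens[2]  # => "Page 3"          (used as the file name, plus extension)
```

Note that the separator is space-hyphen-space (`\s-\s`), so hyphens inside the title itself ("Onepunch-Man") do not split the string.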
def fetch_images!
  hydra = Typhoeus::Hydra.new(max_concurrency: hydra_concurrency)
  chapter_list.each_with_index do |chapter_key, chapter_index|
    chapter_images[chapter_key].each do |file|
      downloaded_filename = File.join(manga_root_folder, file.folder, file.filename)
      next if File.exists?(downloaded_filename) # effectively resumes the download list without re-downloading everything
      request = Typhoeus::Request.new file.url
      request.on_complete do |response|
        begin
          # download
          FileUtils.mkdir_p(File.join(manga_root_folder, file.folder))
          File.open(downloaded_filename, "wb+") { |f| f.write response.body }
          unless is_test
            # resize
            image = Magick::Image.read(downloaded_filename).first
            resized = image.resize_to_fit(600, 800)
            resized.write(downloaded_filename) { self.quality = 50 }
            GC.start # to avoid too big a leak (ImageMagick is notorious for that, especially on resizes)
          end
          print '.'
        rescue => e
          self.fetch_images_errors << { url: file.url, error: e }
          print '#'
        end
      end
      hydra.queue request
    end
  end
  hydra.run
  unless fetch_images_errors.empty?
    puts "\nErrors downloading images:"
    puts fetch_images_errors
  end
  current_state :images
end
def compile_ebooks!
  folders = Dir[manga_root_folder + "/*/"].sort_by { |element| element.split(" ").last.to_i }
  self.download_links = folders.inject([]) do |list, folder|
    list += Dir[folder + "*.*"].sort_by { |element| element.split(" ").last.to_i }
  end

  # concatenating PDF files (250 pages per volume)
  chapter_number = 0
  while !download_links.empty?
    chapter_number += 1
    pdf_file = File.join(manga_root_folder, "#{manga_title} #{chapter_number}.pdf")
    list = download_links.slice!(0..pages_per_volume)
    Prawn::Document.generate(pdf_file, page_size: page_size) do |pdf|
      list.each do |image_file|
        begin
          pdf.image image_file, position: :center, vposition: :center
        rescue => e
          puts "Error in #{image_file} - #{e}"
        end
      end
    end
    print '.'
  end
  current_state :ebooks
end
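The volume splitting relies on the destructive `slice!`: each pass removes the next batch of page files from the list until it is empty. In isolation, with a toy list standing in for the image paths:

```ruby
# Consume a list in fixed-size batches, the way compile_ebooks!
# carves download_links into PDF volumes.
pages_per_volume = 3
download_links = %w[p1 p2 p3 p4 p5 p6 p7 p8]

volumes = []
until download_links.empty?
  volumes << download_links.slice!(0..pages_per_volume)
end
volumes  # => [["p1", "p2", "p3", "p4"], ["p5", "p6", "p7", "p8"]]
```

Note that `0..pages_per_volume` is an inclusive range, so each volume actually takes `pages_per_volume + 1` images — a quirk worth knowing when tuning the 250-page setting.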
manga-downloadr -t
199.69s user 10.30s system 124% cpu 2:48.14 total
.
├── _build
│   └── ...
├── config
│   └── config.exs
├── deps
│   ├── ...
├── ex_manga_downloadr
├── lib
│   ├── ex_manga_downloadr
│   │   ├── cli.ex
│   │   ├── mangafox
│   │   │   ├── chapter_page.ex
│   │   │   ├── index_page.ex
│   │   │   └── page.ex
│   │   ├── mangareader
│   │   │   ├── chapter_page.ex
│   │   │   ├── index_page.ex
│   │   │   └── page.ex
│   │   ├── pool_management
│   │   │   ├── supervisor.ex
│   │   │   └── worker.ex
│   │   └── workflow.ex
│   ├── ex_manga_downloadr.ex
│   └── pool_management.ex
├── mix.exs
├── mix.lock
├── README.md
└── test
    ├── ex_manga_downloadr
    │   ├── mangafox_test.exs
    │   └── mangareader_test.exs
    ├── ex_manga_downloadr_test.exs
    └── test_helper.exs

61 directories, 281 files
mix.exs

defmodule ExMangaDownloadr.Mixfile do
  use Mix.Project

  def project do
    [app: :ex_manga_downloadr,
     version: "1.0.1",
     elixir: "~> 1.1",
     build_embedded: Mix.env == :prod,
     start_permanent: Mix.env == :prod,
     escript: [main_module: ExMangaDownloadr.CLI],
     deps: deps]
  end

  # Configuration for the OTP application
  #
  # Type "mix help compile.app" for more information
  def application do
    [applications: [:logger, :httpotion, :porcelain],
     mod: {PoolManagement, []}]
  end

  defp deps do
    [
      {:ibrowse, "~> 4.2.2"},
      {:httpotion, "~> 3.0.0"},
      {:floki, "~> 0.9.0"},
      {:porcelain, "~> 2.0.1"},
      {:poolboy, "~> 1.5.1"},
      {:mock, "~> 0.1.3", only: :test}
    ]
  end
end
PoolManagement
pool_management.ex

defmodule PoolManagement do
  use Application

  def start(_type, _args) do
    PoolManagement.Supervisor.start_link
  end
end
Supervisor
supervisor.ex

defmodule PoolManagement.Supervisor do
  use Supervisor

  def start_link do
    Supervisor.start_link(__MODULE__, [])
  end

  def init([]) do
    pool_size = System.get_env("POOL_SIZE") || "50"
    pool_options = [
      name: {:local, :worker_pool},
      worker_module: PoolManagement.Worker,
      size: String.to_integer(pool_size),
      max_overflow: 0
    ]
    children = [
      supervisor(Task.Supervisor, [[name: Fetcher.TaskSupervisor,
        strategy: :transient, max_restarts: 10]]),
      :poolboy.child_spec(:worker_pool, pool_options, [])
    ]
    supervise(children, strategy: :one_for_one)
  end
end
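The supervisor caps concurrency with a poolboy pool whose size comes from the `POOL_SIZE` environment variable, defaulting to 50. The same knob in plain Ruby terms (a toy analogy to poolboy's checkout/checkin, not OTP; `with_worker` is an illustrative name):

```ruby
require 'thread'

# Pool size from an env var with a default, as in the supervisor above.
pool_size = (ENV["POOL_SIZE"] || "50").to_i

# A SizedQueue of "tickets" bounds how many workers can be checked out at once.
tickets = SizedQueue.new(pool_size)
pool_size.times { tickets << :worker }

# checkout / checkin, roughly like :poolboy.transaction/2
def with_worker(tickets)
  w = tickets.pop
  yield w
ensure
  tickets << w
end

with_worker(tickets) { |w| w }  # => :worker
```

The point of the fixed pool (and `max_overflow: 0`) is back-pressure: no matter how many pages are queued, at most `pool_size` requests are in flight.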
Worker
worker.ex

defmodule PoolManagement.Worker do
  use GenServer

  # Public APIs
  def index_page(url, source) do
  end

  def chapter_page([chapter_link, source]) do
  end

  def page_image([page_link, source]) do
  end

  def page_download_image(image_data, directory) do
  end

  # internal GenServer implementation
  def handle_call({:chapter_page, chapter_link, source}, _from, state) do
  end

  def handle_call({:page_image, page_link, source}, _from, state) do
  end

  def handle_call({:page_download_image, image_data, directory}, _from, state) do
  end

  ## Helper functions
  defp manga_source(source, module) do
    case source do
      "mangafox"    -> :"Elixir.ExMangaDownloadr.Mangafox.#{module}"
      "mangareader" -> :"Elixir.ExMangaDownloadr.MangaReader.#{module}"
    end
  end

  defp download_image({image_src, image_filename}, directory) do
  end
end
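`manga_source` maps a source string to a module atom at runtime, so the same worker can dispatch to either site's parser. A rough Ruby equivalent of that dispatch idea, using `const_get` (the modules below are stand-ins, not the gem's real classes):

```ruby
# Map a source name to a parser module at runtime, as manga_source
# does with Elixir module atoms. These modules are illustrative stubs.
module Mangafox;    module IndexPage; end; end
module MangaReader; module IndexPage; end; end

def manga_source(source, module_name)
  namespace = case source
              when "mangafox"    then Mangafox
              when "mangareader" then MangaReader
              end
  namespace.const_get(module_name)
end

manga_source("mangareader", "IndexPage")  # => MangaReader::IndexPage
```

This keeps the per-site scraping logic behind a common interface: adding a new source means adding a namespace, not touching the worker.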
.
├── _build
│   └── ...
├── config
│   └── config.exs
├── deps
│   ├── ...
├── ex_manga_downloadr
├── lib
│   ├── ex_manga_downloadr
│   │   ├── cli.ex
│   │   ├── mangafox
│   │   │   ├── chapter_page.ex
│   │   │   ├── index_page.ex
│   │   │   └── page.ex
│   │   ├── mangareader
│   │   │   ├── chapter_page.ex
│   │   │   ├── index_page.ex
│   │   │   └── page.ex
│   │   ├── pool_management
│   │   │   ├── supervisor.ex
│   │   │   └── worker.ex
│   │   └── workflow.ex
│   ├── ex_manga_downloadr.ex
│   └── pool_management.ex
├── mix.exs
├── mix.lock
├── README.md
└── test
    ├── ex_manga_downloadr
    │   ├── mangafox_test.exs
    │   └── mangareader_test.exs
    ├── ex_manga_downloadr_test.exs
    └── test_helper.exs

61 directories, 281 files
InterCon 2016 - Performance, anti-patterns e stacks para desenvolvimento ágil
POOL
defmodule PoolManagement.Worker do
  use GenServer
  ...
  def chapter_page([chapter_link, source]) do
    Task.Supervisor.async(Fetcher.TaskSupervisor, fn ->
      :poolboy.transaction :worker_pool, fn(server) ->
        GenServer.call(server, {:chapter_page, chapter_link, source}, @genserver_call_timeout)
      end, @task_async_timeout
    end)
  end
  ...
  def handle_call({:chapter_page, chapter_link, source}, _from, state) do
    links = source
      |> manga_source("ChapterPage")
      |> apply(:pages, [chapter_link])
    {:reply, links, state}
  end
  ...
  defp manga_source(source, module) do
    case source do
      "mangareader" -> :"Elixir.ExMangaDownloadr.MangaReader.#{module}"
      "mangafox"    -> :"Elixir.ExMangaDownloadr.Mangafox.#{module}"
    end
  end
end
defmodule ExMangaDownloadr.Mangafox.ChapterPage do
  require Logger
  require ExMangaDownloadr

  def pages(chapter_link) do
    ExMangaDownloadr.fetch chapter_link, do: fetch_pages(chapter_link)
  end

  defp fetch_pages(html, chapter_link) do
    [_page|link_template] = chapter_link |> String.split("/") |> Enum.reverse

    html
    |> Floki.find("div[id='top_center_bar'] option")
    |> Floki.attribute("value")
    |> Enum.reject(fn page_number -> page_number == "0" end)
    |> Enum.map(fn page_number ->
      ["#{page_number}.html"|link_template]
        |> Enum.reverse
        |> Enum.join("/")
    end)
  end
end
cli.ex

defmodule ExMangaDownloadr.CLI do
  alias ExMangaDownloadr.Workflow
  require ExMangaDownloadr

  def main(args) do
    args
    |> parse_args
    |> process
  end
  ...
  defp parse_args(args) do
  end

  defp process(:help) do
  end

  defp process(directory, url) do
    File.mkdir_p!(directory)
    File.mkdir_p!("/tmp/ex_manga_downloadr_cache")
    manga_name = directory |> String.split("/") |> Enum.reverse |> Enum.at(0)
    url
      |> Workflow.determine_source
      |> Workflow.chapters
      |> Workflow.pages
      |> Workflow.images_sources
      |> Workflow.process_downloads(directory)
      |> Workflow.optimize_images
      |> Workflow.compile_pdfs(manga_name)
      |> finish_process
  end

  defp process_test(directory, url) do
  end

  defp finish_process(directory) do
  end
end
workflow.ex

defmodule ExMangaDownloadr.Workflow do
  alias PoolManagement.Worker
  require Logger

  def chapters({url, source}) do
  end

  def pages({chapter_list, source}) do
    pages_list = chapter_list
      |> Enum.map(&Worker.chapter_page([&1, source]))
      |> Enum.map(&Task.await(&1, @await_timeout_ms))
      |> Enum.reduce([], fn {:ok, list}, acc -> acc ++ list end)
    {pages_list, source}
  end

  def images_sources({pages_list, source}) do
  end

  def process_downloads(images_list, directory) do
  end

  def optimize_images(directory) do
    Porcelain.shell("mogrify -resize #{@image_dimensions} #{directory}/*.jpg")
    directory
  end

  def compile_pdfs(directory, manga_name) do
  end
end
ex_manga_downloadr --test
28.36s user 15.57s system 33% cpu 2:10.28 total
.
├── cr_manga_downloadr
├── libs
│   ├── ...
├── LICENSE
├── README.md
├── shard.lock
├── shard.yml
├── spec
│   ├── cr_manga_downloadr
│   │   ├── chapters_spec.cr
│   │   ├── concurrency_spec.cr
│   │   ├── image_downloader_spec.cr
│   │   ├── page_image_spec.cr
│   │   └── pages_spec.cr
│   ├── fixtures
│   │   ├── ...
│   └── spec_helper.cr
└── src
    ├── cr_manga_downloadr
    │   ├── chapters.cr
    │   ├── concurrency.cr
    │   ├── downloadr_client.cr
    │   ├── image_downloader.cr
    │   ├── page_image.cr
    │   ├── pages.cr
    │   ├── records.cr
    │   ├── version.cr
    │   └── workflow.cr
    └── cr_manga_downloadr.cr
File.mkdir_p!(directory)
File.mkdir_p!("/tmp/ex_manga_downloadr_cache")
manga_name = directory |> String.split("/") |> Enum.reverse |> Enum.at(0)
url
    |> Workflow.determine_source
    |> Workflow.chapters
    |> Workflow.pages
    |> Workflow.images_sources
    |> Workflow.process_downloads(directory)
    |> Workflow.optimize_images
    |> Workflow.compile_pdfs(manga_name)
    |> finish_process
end
def run
  Dir.mkdir_p @config.download_directory
  pipe Steps.fetch_chapters(@config)
      .>> Steps.fetch_pages(@config)
      .>> Steps.fetch_images(@config)
      .>> Steps.download_images(@config)
      .>> Steps.optimize_images(@config)
      .>> Steps.prepare_volumes(@config)
      .>> unwrap
  puts "Done!"
end
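The Crystal `pipe`/`.>>` chain plays the role of Elixir's `|>`: each step receives the previous step's result. Ruby's `Object#then` gives the same shape; the step lambdas below are hypothetical stubs, not the real steps:

```ruby
# Hypothetical sketch of a pipeline where each stage consumes the
# previous stage's output, like Elixir's |> or the Crystal .>> chain.
config = { download_directory: "/tmp/demo" }

fetch_chapters = ->(cfg)      { ["c1", "c2"] }                       # stub
fetch_pages    = ->(chapters) { chapters.flat_map { |c| ["#{c}/p1"] } } # stub

result = config
  .then(&fetch_chapters)
  .then(&fetch_pages)
```

The payoff in both languages is the same: the whole workflow reads top to bottom as a list of stages, with no intermediate variables to thread through by hand.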
defmodule ExMangaDownloadr.MangaReader.IndexPage do
  require Logger
  require ExMangaDownloadr

  def chapters(manga_root_url) do
    ExMangaDownloadr.fetch manga_root_url, do: collect
  end

  defp collect(html) do
    {fetch_manga_title(html), fetch_chapters(html)}
  end

  defp fetch_manga_title(html) do
    html
    |> Floki.find("#mangaproperties h1")
    |> Floki.text
  end

  defp fetch_chapters(html) do
    html
    |> Floki.find("#listing a")
    |> Floki.attribute("href")
  end
end
require "./downloadr_client"
require "xml"

module CrMangaDownloadr
  class Chapters < DownloadrClient
    def initialize(@domain, @root_uri : String, @cache_http = false)
      super(@domain, @cache_http)
    end

    def fetch
      html = get(@root_uri)
      nodes = html.xpath_nodes(
        "//table[contains(@id, 'listing')]//td//a/@href")
      nodes.map { |node| node.text.as(String) }
    end
  end
end
module CrMangaDownloadr
  class DownloadrClient
    @http_client : HTTP::Client

    def initialize(@domain : String, @cache_http = false)
    end

    def get(uri : String)
      cache_path = "/tmp/cr_manga_downloadr_cache/#{cache_filename(uri)}"
      while true
        begin
          response = if @cache_http && File.exists?(cache_path)
            body = File.read(cache_path)
            HTTP::Client::Response.new(200, body)
          else
            @http_client.get(uri, headers: HTTP::Headers{
              "User-Agent" => CrMangaDownloadr::USER_AGENT })
          end
          case response.status_code
          when 301
            uri = response.headers["Location"]
          when 200
            if @cache_http && !File.exists?(cache_path)
              File.open(cache_path, "w") do |f|
                f.print response.body
              end
            end
            return XML.parse_html(response.body)
          end
        rescue IO::Timeout
          puts "Sleeping over #{uri}"
          sleep 1
        end
      end
    end
  end
end
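The core of `DownloadrClient#get` is a file-backed cache: serve the body from disk when present, otherwise fetch and store it. A minimal Ruby sketch of just that cache-or-fetch logic, with a lambda standing in for the real HTTP call (names here are hypothetical):

```ruby
require "fileutils"
require "digest"

CACHE_DIR = "/tmp/downloadr_cache_demo"

# Hypothetical sketch of the cache-then-fetch logic: hash the URI into a
# cache filename, read it if present, otherwise "fetch" and persist.
def get(uri, fetcher)
  FileUtils.mkdir_p(CACHE_DIR)
  path = File.join(CACHE_DIR, Digest::MD5.hexdigest(uri))
  return File.read(path) if File.exist?(path)

  body = fetcher.call(uri) # stands in for the real HTTP GET
  File.write(path, body)
  body
end

first  = get("http://example.com/a", ->(u) { "body-of-#{u}" })
# the second call must hit the cache; the lambda would raise otherwise
second = get("http://example.com/a", ->(u) { raise "network hit" })
```

Caching like this is what makes the `--test` runs in the talk repeatable: re-running the pipeline re-parses cached HTML instead of hammering the site.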
require "fiberpool"

module CrMangaDownloadr
  struct Concurrency
    def initialize(@config : Config, @turn_on_engine = true); end

    def fetch(collection : Array(A)?, engine_class : E.class,
              &block : A, E? -> Array(B)?) : Array(B)
      results = [] of B
      if collection
        pool = Fiberpool.new(collection,
          @config.download_batch_size)
        pool.run do |item|
          engine = if @turn_on_engine
                     engine_class.new(@config.domain,
                       @config.cache_http)
                   end
          if reply = block.call(item, engine)
            results.concat(reply)
          end
        end
      end
      results
    end
  end
end
module CrMangaDownloadr
  class Workflow
  end

  module Steps
    def self.fetch_chapters(config : Config)
    end

    def self.fetch_pages(chapters : Array(String)?, config : Config)
      puts "Fetching pages from all chapters ..."
      reactor = Concurrency.new(config)
      reactor.fetch(chapters, Pages) do |link, engine|
        engine.try(&.fetch(link))
      end
    end

    def self.fetch_images(pages : Array(String)?, config : Config)
    end

    def self.download_images(images : Array(Image)?, config : Config)
    end

    def self.optimize_images(downloads : Array(String), config : Config)
    end

    def self.prepare_volumes(downloads : Array(String), config : Config)
    end
  end
end
InterCon 2016 - Performance, anti-patterns e stacks para desenvolvimento ágil
cr_manga_downloadr -t
0.28s user 0.53s system 0% cpu 1:52.45 total
.
├── _build
│   └── ...
├── config
│   └── config.exs
├── deps
│   ├── ...
├── ex_manga_downloadr
├── lib
│   ├── ex_manga_downloadr
│   │   ├── cli.ex
│   │   ├── mangafox
│   │   │   ├── chapter_page.ex
│   │   │   ├── index_page.ex
│   │   │   └── page.ex
│   │   ├── mangareader
│   │   │   ├── chapter_page.ex
│   │   │   ├── index_page.ex
│   │   │   └── page.ex
│   │   ├── pool_management
│   │   │   ├── supervisor.ex
│   │   │   └── worker.ex
│   │   └── workflow.ex
│   ├── ex_manga_downloadr.ex
│   └── pool_management.ex
├── mix.exs
├── mix.lock
├── README.md
└── test
    ├── ex_manga_downloadr
    │   ├── mangafox_test.exs
    │   └── mangareader_test.exs
    ├── ex_manga_downloadr_test.exs
    └── test_helper.exs
61 directories, 281 files
.
├── cr_manga_downloadr
├── libs
│   ├── ...
├── LICENSE
├── README.md
├── shard.lock
├── shard.yml
├── spec
│   ├── cr_manga_downloadr
│   │   ├── chapters_spec.cr
│   │   ├── concurrency_spec.cr
│   │   ├── image_downloader_spec.cr
│   │   ├── page_image_spec.cr
│   │   └── pages_spec.cr
│   ├── fixtures
│   │   ├── ...
│   └── spec_helper.cr
└── src
    ├── cr_manga_downloadr
    │   ├── chapters.cr
    │   ├── concurrency.cr
    │   ├── downloadr_client.cr
    │   ├── image_downloader.cr
    │   ├── page_image.cr
    │   ├── pages.cr
    │   ├── records.cr
    │   ├── version.cr
    │   └── workflow.cr
    └── cr_manga_downloadr.cr
.
├── bin
│   └── manga-downloadr
├── Gemfile
├── Gemfile.lock
├── lib
│   ├── manga-downloadr
│   │   ├── chapters.rb
│   │   ├── concurrency.rb
│   │   ├── downloadr_client.rb
│   │   ├── image_downloader.rb
│   │   ├── page_image.rb
│   │   ├── pages.rb
│   │   ├── records.rb
│   │   ├── version.rb
│   │   └── workflow.rb
│   └── manga-downloadr.rb
├── LICENSE.txt
├── manga-downloadr.gemspec
├── Rakefile
├── README.md
└── spec
    ├── fixtures
    │   ├── ...
    ├── manga-downloadr
    │   ├── chapters_spec.rb
    │   ├── concurrency_spec.rb
    │   ├── image_downloader_spec.rb
    │   ├── page_image_spec.rb
    │   └── pages_spec.rb
    └── spec_helper.rb
def run
  Dir.mkdir_p @config.download_directory
  pipe Steps.fetch_chapters(@config)
    .>> Steps.fetch_pages(@config)
    .>> Steps.fetch_images(@config)
    .>> Steps.download_images(@config)
    .>> Steps.optimize_images(@config)
    .>> Steps.prepare_volumes(@config)
    .>> unwrap
  puts "Done!"
end
def self.run(config = Config.new)
  FileUtils.mkdir_p config.download_directory
  CM(config, Workflow)
    .fetch_chapters
    .fetch_pages(config)
    .fetch_images(config)
    .download_images(config)
    .optimize_images(config)
    .prepare_volumes(config)
    .unwrap
  puts "Done!"
end
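The pipeline style above comes from the chainable_methods gem. As a minimal sketch of the idea (this `Chain` class and `Steps` module are hypothetical simplifications, not the gem's actual API), a wrapper can thread the previous result into each module function via `method_missing` and release the final value with `unwrap`:

```ruby
# Hypothetical sketch of a chainable pipeline: each call forwards to a module
# function, passing the wrapped value as the first argument.
class Chain
  def initialize(value, mod)
    @value = value
    @mod = mod
  end

  # Forward unknown calls to the module, threading the current value through.
  def method_missing(name, *args, &block)
    Chain.new(@mod.send(name, @value, *args, &block), @mod)
  end

  def respond_to_missing?(name, include_private = false)
    @mod.respond_to?(name) || super
  end

  # Release the accumulated value at the end of the pipeline.
  def unwrap
    @value
  end
end

# Toy steps standing in for fetch_chapters, fetch_pages, etc.
module Steps
  def self.double(x)
    x * 2
  end

  def self.add(x, n)
    x + n
  end
end

result = Chain.new(5, Steps).double.add(3).unwrap
# result == 13
```

Each step stays a plain module function that is easy to test in isolation; the wrapper only supplies the plumbing between them.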
# concurrency.cr
pool = Fiberpool.new(collection, @config.download_batch_size)
pool.run do |item|
  engine = if @turn_on_engine
             engine_class.new(@config.domain, @config.cache_http)
           end
  if reply = block.call(item, engine)
    results.concat(reply)
  end
end
pool    = Thread.pool(@config.download_batch_size)
mutex   = Mutex.new
results = []
collection.each do |item|
  pool.process {
    engine = @turn_on_engine ? @engine_klass.new(@config.domain, @config.cache_http) : nil
    reply  = block.call(item, engine)&.flatten
    mutex.synchronize do
      results += (reply || [])
    end
  }
end
pool.shutdown
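`Thread.pool` above comes from the thread gem. The same batched-worker pattern can be sketched with only the Ruby standard library (`fetch_in_batches` is a hypothetical helper, not part of manga-downloadr): a `Queue` feeds a fixed number of worker threads, and a `Mutex` guards the shared results array, exactly as in the pool version.

```ruby
# Stdlib-only sketch of a bounded worker pool: batch_size threads drain a
# queue, and a mutex protects the shared results array.
def fetch_in_batches(collection, batch_size)
  queue = Queue.new
  collection.each { |item| queue << item }
  results = []
  mutex = Mutex.new
  workers = Array.new(batch_size) do
    Thread.new do
      loop do
        item = begin
          queue.pop(true) # non-blocking pop; raises when the queue is empty
        rescue ThreadError
          break
        end
        reply = yield(item)
        mutex.synchronize { results.concat(Array(reply)) } if reply
      end
    end
  end
  workers.each(&:join)
  results
end

squares = fetch_in_batches((1..10).to_a, 4) { |n| [n * n] }
squares.sort # => [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
```

Completion order is nondeterministic, so callers that care about ordering must sort afterwards; the thread-gem pool has the same property.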
module CrMangaDownloadr
  class Pages < DownloadrClient
    def fetch(chapter_link : String)
      html = get(chapter_link)
      nodes = html.xpath_nodes("//div[@id='selectpage']//select[@id='pageMenu']//option")
      nodes.map { |node| "#{chapter_link}/#{node.text}" }
    end
  end
end
module MangaDownloadr
  class Pages < DownloadrClient
    def fetch(chapter_link)
      get chapter_link do |html|
        nodes = html.xpath("//div[@id='selectpage']//select[@id='pageMenu']//option")
        nodes.map { |node| [chapter_link, node.children.to_s].join("/") }
      end
    end
  end
end
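The extraction step above can be exercised without any HTTP client or Nokogiri (the parser manga-downloadr actually uses). A minimal sketch with stdlib REXML, given a well-formed snippet of the chapter page (`page_links` is a hypothetical helper written for this illustration):

```ruby
require "rexml/document"

# Sketch of the page-link extraction: select the <option> nodes from the
# page menu and append each one's text to the chapter link.
def page_links(chapter_link, html)
  doc = REXML::Document.new(html)
  REXML::XPath.match(doc, "//select[@id='pageMenu']//option").map do |node|
    "#{chapter_link}/#{node.text}"
  end
end

html = <<~HTML
  <div id="selectpage">
    <select id="pageMenu">
      <option>1</option>
      <option>2</option>
    </select>
  </div>
HTML

links = page_links("/onepunch-man/1", html)
# => ["/onepunch-man/1/1", "/onepunch-man/1/2"]
```

Keeping the XPath logic in a small pure function like this is what makes the spec files (pages_spec.rb / pages_spec.cr) able to run against HTML fixtures instead of the live site.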
manga-downloadr -t
16.55s user 6.65s system 17% cpu 2:13.86 total
Ruby/Typhoeus 124% CPU 2:38 min
Elixir 33% CPU 2:10 min
Crystal 0% CPU 1:52 min
Ruby 17% CPU 2:13 min
Ruby Typhoeus libcurl
Elixir OTP Poolboy
Crystal Fibers Fiberpool
Ruby Thread Thread/Pool
manga-downloadr
ex_manga_downloadr
cr_manga_downloadr
fiberpool
cr_chainable_methods
chainable_methods
PREMATURE
OPTIMIZATION
The Root of ALL Evil
THANKS
@akitaonrails
slideshare.net/akitaonrails
GigaIO: The March of Composability Onward to Memory with CXLCXL Forum
126 views12 slides
Java Platform Approach 1.0 - Picnic Meetup by
Java Platform Approach 1.0 - Picnic MeetupJava Platform Approach 1.0 - Picnic Meetup
Java Platform Approach 1.0 - Picnic MeetupRick Ossendrijver
25 views39 slides

Recently uploaded(20)

"Fast Start to Building on AWS", Igor Ivaniuk by Fwdays
"Fast Start to Building on AWS", Igor Ivaniuk"Fast Start to Building on AWS", Igor Ivaniuk
"Fast Start to Building on AWS", Igor Ivaniuk
Fwdays36 views
"Thriving Culture in a Product Company — Practical Story", Volodymyr Tsukur by Fwdays
"Thriving Culture in a Product Company — Practical Story", Volodymyr Tsukur"Thriving Culture in a Product Company — Practical Story", Volodymyr Tsukur
"Thriving Culture in a Product Company — Practical Story", Volodymyr Tsukur
Fwdays40 views
Micron CXL product and architecture update by CXL Forum
Micron CXL product and architecture updateMicron CXL product and architecture update
Micron CXL product and architecture update
CXL Forum27 views
"How we switched to Kanban and how it integrates with product planning", Vady... by Fwdays
"How we switched to Kanban and how it integrates with product planning", Vady..."How we switched to Kanban and how it integrates with product planning", Vady...
"How we switched to Kanban and how it integrates with product planning", Vady...
Fwdays61 views
GigaIO: The March of Composability Onward to Memory with CXL by CXL Forum
GigaIO: The March of Composability Onward to Memory with CXLGigaIO: The March of Composability Onward to Memory with CXL
GigaIO: The March of Composability Onward to Memory with CXL
CXL Forum126 views
[2023] Putting the R! in R&D.pdf by Eleanor McHugh
[2023] Putting the R! in R&D.pdf[2023] Putting the R! in R&D.pdf
[2023] Putting the R! in R&D.pdf
Eleanor McHugh38 views
Samsung: CMM-H Tiered Memory Solution with Built-in DRAM by CXL Forum
Samsung: CMM-H Tiered Memory Solution with Built-in DRAMSamsung: CMM-H Tiered Memory Solution with Built-in DRAM
Samsung: CMM-H Tiered Memory Solution with Built-in DRAM
CXL Forum105 views
Liqid: Composable CXL Preview by CXL Forum
Liqid: Composable CXL PreviewLiqid: Composable CXL Preview
Liqid: Composable CXL Preview
CXL Forum121 views
How to reduce cold starts for Java Serverless applications in AWS at JCON Wor... by Vadym Kazulkin
How to reduce cold starts for Java Serverless applications in AWS at JCON Wor...How to reduce cold starts for Java Serverless applications in AWS at JCON Wor...
How to reduce cold starts for Java Serverless applications in AWS at JCON Wor...
Vadym Kazulkin70 views
"Ukrainian Mobile Banking Scaling in Practice. From 0 to 100 and beyond", Vad... by Fwdays
"Ukrainian Mobile Banking Scaling in Practice. From 0 to 100 and beyond", Vad..."Ukrainian Mobile Banking Scaling in Practice. From 0 to 100 and beyond", Vad...
"Ukrainian Mobile Banking Scaling in Practice. From 0 to 100 and beyond", Vad...
Fwdays40 views
"AI Startup Growth from Idea to 1M ARR", Oleksandr Uspenskyi by Fwdays
"AI Startup Growth from Idea to 1M ARR", Oleksandr Uspenskyi"AI Startup Growth from Idea to 1M ARR", Oleksandr Uspenskyi
"AI Startup Growth from Idea to 1M ARR", Oleksandr Uspenskyi
Fwdays26 views
Beyond the Hype: What Generative AI Means for the Future of Work - Damien Cum... by NUS-ISS
Beyond the Hype: What Generative AI Means for the Future of Work - Damien Cum...Beyond the Hype: What Generative AI Means for the Future of Work - Damien Cum...
Beyond the Hype: What Generative AI Means for the Future of Work - Damien Cum...
NUS-ISS28 views
Data-centric AI and the convergence of data and model engineering: opportunit... by Paolo Missier
Data-centric AI and the convergence of data and model engineering:opportunit...Data-centric AI and the convergence of data and model engineering:opportunit...
Data-centric AI and the convergence of data and model engineering: opportunit...
Paolo Missier29 views
PharoJS - Zürich Smalltalk Group Meetup November 2023 by Noury Bouraqadi
PharoJS - Zürich Smalltalk Group Meetup November 2023PharoJS - Zürich Smalltalk Group Meetup November 2023
PharoJS - Zürich Smalltalk Group Meetup November 2023
Noury Bouraqadi113 views
AMD: 4th Generation EPYC CXL Demo by CXL Forum
AMD: 4th Generation EPYC CXL DemoAMD: 4th Generation EPYC CXL Demo
AMD: 4th Generation EPYC CXL Demo
CXL Forum126 views

InterCon 2016 - Performance, anti-patterns e stacks para desenvolvimento ágil