Neuroevolution in Elixir

Presented at EMPEX NYC 2018

  1. Neuroevolution In Elixir
     Jeff Smith · @jeffksmithjr · jeffsmith.tech
  2. init/1
  3. How can we build technology
  4. That learns from biology
  5. That serves humans
  6. Deep Learning
  7. Evolution
  8. Neuroevolution in Elixir
  9. Evolution

      def evolve(nets, generations) do
        evolve(nets, generations, &decrement/1)
      end

      def evolve(nets, generations, count_function) when generations > 0 do
        Task.Supervisor.async(GN.TaskSupervisor, fn ->
          IO.puts("Generations remaining: #{generations}")

          learn_generation(nets)
          |> select()
          |> evolve(count_function.(generations), count_function)
        end)
      end

      def evolve(nets, _generations, _count_function) do
        nets
      end
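      The decrement/1 default count function isn't shown in the deck; presumably it's a one-liner along these lines:

      # Assumed definition: count one generation down per pass.
      def decrement(generations), do: generations - 1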
  10. Evolution

      def learn_generation(nets) do
        clean_nets = strip_empties(nets)

        tasks =
          Task.Supervisor.async_stream_nolink(
            GN.TaskSupervisor,
            clean_nets,
            &start_and_spawn(&1),
            timeout: GN.Parameters.get(__MODULE__, :timeout)
          )

        generation = for {status, net} <- tasks, status == :ok, do: net
        IO.puts(inspect(generation))
        generation
      end
  11. Evolution

      def spawn_offspring(seed_layers, mutation_rate \\ @mutation_rate) do
        duplicate(seed_layers, mutation_rate)
        |> remove(mutation_rate)
        |> Enum.map(&mutate(&1, mutation_rate))
      end
  12. Mutation

      def mutate_params(params, mutation_rate) do
        for param <- params do
          cond do
            should_mutate(mutation_rate) ->
              cond do
                is_atom(param) ->
                  Enum.take_random(activation_functions(), 1) |> hd()

                is_integer(param) ->
                  Statistics.Distributions.Normal.rand(param, @std_dev)
                  |> Statistics.Math.to_int()

                is_float(param) ->
                  :rand.uniform()
              end
            ...
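      should_mutate/1 is used here and again on slide 35 but never shown; a minimal sketch, assuming the mutation rate is the per-parameter probability of mutating:

      # Hypothetical helper: mutate when a uniform draw from (0, 1)
      # falls below the mutation rate.
      def should_mutate(mutation_rate), do: :rand.uniform() < mutation_rate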
  13. Diversity in Ecosystems
  14. Selection

      def select(pid \\ __MODULE__, nets) do
        cutoffs = cutoffs(nets)

        for net <- nets do
          complexity = length(net.layers)

          level =
            Enum.min([
              Enum.find_index(cutoffs, &(&1 >= complexity)) + 1,
              complexity_levels()
            ])

          net_acc = net.test_acc
          elite_acc = Map.get(get(level), :test_acc)

          if is_nil(elite_acc) or net_acc > elite_acc do
            put(pid, level, net)
          end
        end
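      Neither cutoffs/1 nor complexity_levels/0 appears in the deck; one plausible sketch, assuming evenly spaced complexity thresholds with one cutoff per elite level:

      # Hypothetical cutoffs/1: split the observed complexity range into
      # complexity_levels() evenly spaced thresholds.
      def cutoffs(nets) do
        max_complexity = nets |> Enum.map(&length(&1.layers)) |> Enum.max()
        step = max(div(max_complexity, complexity_levels()), 1)
        Enum.map(1..complexity_levels(), &(&1 * step))
      end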
  15. Evolution

      iex(1)> GN.Selection.get_all()
      %{
        1 => %GN.Network{
          id: "0c2020ad-8944-4f2c-80bd-1d92c9d26535",
          layers: [
            dense: [64, :softrelu],
            batch_norm: [],
            activation: [:relu],
            dropout: [0.5],
            dense: [63, :relu]
          ],
          test_acc: 0.8553
        },
        2 => %GN.Network{
          id: "58229333-a05d-4371-8f23-e8e55c37a2ec",
          layers: [
            dense: [64, :relu],
  16. Continual Learning
  17. Continual Learning

      iex(2)> GN.Example.infinite_example()
      %Task{
        owner: #PID<0.171.0>,
        pid: #PID<0.213.0>,
        ref: #Reference<0.1968944036.911736833.180535>
      }
      Generations remaining: infinity
  18. Continual Learning

      def evolve_continual(nets) do
        evolve(nets, :infinity, & &1)
      end

      def evolve(nets, generations, count_function) when generations > 0 do
        Task.Supervisor.async(GN.TaskSupervisor, fn ->
          IO.puts("Generations remaining: #{generations}")

          learn_generation(nets)
          |> select()
          |> evolve(count_function.(generations), count_function)
        end)
      end

      This loops forever: Erlang term ordering ranks any atom above any integer, so :infinity always satisfies generations > 0, and the identity count function & &1 never changes the counter.
  19. Interactive Evolution
  20. Continual Learning

      iex(2)> GN.Example.infinite_example()
      %Task{
        owner: #PID<0.171.0>,
        pid: #PID<0.213.0>,
        ref: #Reference<0.1968944036.911736833.180535>
      }
      Generations remaining: infinity
  21. Interactive Evolution

      iex(3)> GN.Parameters.put(GN.Selection, %{complexity_levels: 4})
      :ok
  22. Model Library

      defmodule GN.Library do
        use Agent

        def start_link(opts \\ []) do
          opts = Keyword.put_new(opts, :name, __MODULE__)
          Agent.start_link(fn -> %{} end, opts)
        end

        def put(pid \\ __MODULE__, net) do
          Agent.update(pid, &Map.put(&1, net.id, net))
        end

        def get_all(pid \\ __MODULE__) do
          Agent.get(pid, & &1)
        end
      end
  23. Interactive Evolution

      iex(4)> GN.Selection.get_all() |> Map.get(2) |> GN.Library.put()
      iex(5)> GN.Library.get("02b2a947-f888-4abf-b2a5-5df25668b0ee") |> GN.Selection.put_unevaluated()
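      GN.Library.get/1, called above, doesn't appear in the module on the previous slide; a minimal sketch, assuming it looks a single network up by id in the Agent's map state:

      # Assumed companion to put/2 and get_all/1: fetch one network by id.
      def get(pid \\ __MODULE__, id) do
        Agent.get(pid, &Map.get(&1, id))
      end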
  24. Genotypes
  25. Genotypes: V1

      %GN.Network{
        id: "0c2020ad-8944-4f2c-80bd-1d92c9d26535",
        layers: [
          dense: [64, :softrelu],
          batch_norm: [],
          activation: [:relu],
          dropout: [0.5],
          dense: [63, :relu]
        ],
        test_acc: 0.8553
      }
  26. Layers in Elixir

      def dense(py, n, activation) do
        act_type = Atom.to_string(activation)
        py |> call(dense(n, act_type))
      end

      def activation(py, activation) do
        act_type = Atom.to_string(activation)
        py |> call(activation(act_type))
      end

      def dropout(py, rate) do
        py |> call(dropout(rate))
      end

      def batch_norm(py) do
        py |> call(batch_norm())
      end
  27. Layers in Python

      from mxnet import gluon

      def dense(n, act_type):
          act_type_str = act_type.decode("UTF-8")
          if act_type_str == "none":
              act_type_str = None
          return gluon.nn.Dense(n, activation=act_type_str)

      def activation(act_type):
          return gluon.nn.Activation(act_type.decode("UTF-8"))

      def dropout(rate):
          return gluon.nn.Dropout(rate)

      def batch_norm():
          return gluon.nn.BatchNorm()
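      The call/2 helper on the Elixir side isn't shown; since the stack slide below lists Export/ErlPort, here is a minimal bridge sketch using raw ErlPort (the layers module name and priv path are assumptions):

      # Start a Python instance via ErlPort and invoke one of the layer
      # constructors above; module name and path are illustrative.
      {:ok, py} = :python.start(python_path: 'priv')
      dense_layer = :python.call(py, :layers, :dense, [64, "relu"])

      Elixir binaries arrive in Python as bytes, which is why the Python side calls act_type.decode("UTF-8") before handing the string to Gluon.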
  28. Galápagos Nǎo Tech Stack

      ● Elixir
      ● Apache MXNet/Gluon
      ● Python
      ● Export/ErlPort
      ● Docker
      ● Microsoft Cognitive Toolkit*
      ● ONNX*
      ● Caffe2*

      * WIP
  29. Models
  30. ONNX Format

      message AttributeProto {
        enum AttributeType {
          UNDEFINED = 0;
          FLOAT = 1;
          INT = 2;
          STRING = 3;
          TENSOR = 4;
          GRAPH = 5;
          FLOATS = 6;
          INTS = 7;
          STRINGS = 8;
          TENSORS = 9;
          GRAPHS = 10;
        }
  31. Loading ONNX Models

      iex(1)> {:ok, mnist_data} = File.read "./test/examples/mnist.onnx"
      {:ok,
       <<8, 3, 18, 4, 67, 78, 84, 75, 26, 3, 50, 46, 52, 40, 1, 58, 227, 206, 1,
         10, 199, 80, 18, 12, 80, 97, 114, 97, 109, 101, 116, 101, 114, 49, 57,
         51, 26, 12, 80, 97, 114, 97, 109, 101, 116, 101, 114, 49, ...>>}
  32. ONNXS: Genotypes V2

      iex(2)> mnist_struct = Onnx.ModelProto.decode(mnist_data)
      %Onnx.ModelProto{
        doc_string: nil,
        domain: nil,
        graph: %Onnx.GraphProto{
          doc_string: nil,
          initializer: [],
          input: [
            %Onnx.ValueInfoProto{
              doc_string: nil,
              name: "Input3",
              type: %Onnx.TypeProto{
                value: {:tensor_type,
                 %Onnx.TypeProto.Tensor{
                   elem_type: 1,
                   shape: %Onnx.TensorShapeProto ...
  33. ONNXS

      iex(3)> mnist_updated = %{mnist_struct | model_version: 2}
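      The deck doesn't show the round trip back to bytes; a minimal sketch, assuming ONNXS pairs decode/1 with the usual generated protobuf encode/1 (the output path is illustrative):

      # Re-encode the updated struct and write the new model to disk.
      encoded = Onnx.ModelProto.encode(mnist_updated)
      File.write!("./test/examples/mnist_v2.onnx", encoded)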
  34. Mutation V2

      def mutate(
            %Onnx.NodeProto{
              attribute: [%Onnx.AttributeProto{t: %Onnx.TensorProto{float_data: float_data}}]
            } = layer,
            mutation_rate
          ) do
        mutated_data = mutate_params(float_data, mutation_rate)

        update_in(
          layer,
          [Access.key!(:attribute), Access.all(), Access.key!(:t), Access.key!(:float_data)],
          fn _ -> mutated_data end
        )
      end
  35. Mutation V2

      def mutate_params(params, mutation_rate) do
        for param <- params do
          cond do
            should_mutate(mutation_rate) ->
              cond do
                is_float(param) ->
                  Statistics.Distributions.Normal.rand(param, @std_dev)
              end

            true ->
              param
          end
        end
      end
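      Putting the two slides together over a decoded model's graph, illustrative only: the deck shows no fallback clause for nodes without tensor attributes, so one is assumed here, as is the 0.05 rate.

      # Mutate only nodes that carry a tensor attribute; pass the
      # rest through unchanged.
      mutated_nodes =
        Enum.map(mnist_struct.graph.node, fn
          %Onnx.NodeProto{attribute: [%Onnx.AttributeProto{t: %Onnx.TensorProto{}}]} = node ->
            mutate(node, 0.05)

          node ->
            node
        end)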
  36. Opportunities
  37. Tools for ONNX
  38. Elixir MXNet bindings
  39. Parameter discovery for Python deep learning tools
  40. Phoenix-based model server
  41. EVM numerical computing libraries
  42. Embodied AI
  43. terminate/2
  44. Use the code ctwempex18 for 40% off all Manning books.
  45. References
  46. Neuroevolution In Elixir
      Jeff Smith · @jeffksmithjr · jeffsmith.tech
