Lunch conversations
docker docker stuff stuff docker docker docker
Met two Andrews and a Michael from CrossFit and LivingSocial; they are not using Elixir in production
DockYard CEO on creating packages
Document your packages with Markdown
Use typespecs: `@spec` provides information about the types that your functions expect
@spec drop_database(boolean) :: boolean
def drop_database(doit) do
  # ...
end
Use Dialyzer to lint your functions for potential type errors.
To continue developing while integrating your package elsewhere, use a path dependency:
defp deps do
  [{:crazy_pants, path: "/path/to/crazy_pants_package"}]
end
IEx.pry, the console, Logger, and IO.puts for debugging
Use semantic versioning for versioning (please)
Elm's package manager does whatever npm does but the complete opposite, so it actually works and yells at you if you have not bumped a version properly.
`mix hex.publish` to publish your package
Upload documentation with `mix hex.docs`
Promote your package:
* reddit.com/r/elixir
* elixir-lang-talk Google group
* phoenix-talk Google group
* https://elixirstatus.com
* Elixir Slack
* www.meetup.com/topics/elixir
Journey to the Center of the BEAM, from a speaker from Jet (Jet uses F# entirely)
Make it a habit to look at the source code of the tools you use.
Elixir on GitHub:
Erlang code goes into /src
Elixir code goes into /lib
elixir_parser.yrl describes the Elixir parser
Erlang/OTP on GitHub:
/lib is the source, so /lib/compiler/src is the Erlang compiler
gen.erl -> do_call(Process, Label, Request, Timeout) is called when you make a "call"
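For example, a plain GenServer call from Elixir ends up going through that gen.erl code path; the Counter module below is just a made-up illustration:

```elixir
defmodule Counter do
  use GenServer

  def init(n), do: {:ok, n}

  # handle_call/3 receives the request that gen.erl delivered to this process
  def handle_call(:current, _from, n), do: {:reply, n, n}
end

{:ok, pid} = GenServer.start_link(Counter, 0)
# GenServer.call/3 delegates to Erlang's :gen module (gen.erl -> do_call/4)
GenServer.call(pid, :current, 5_000)   # => 0
```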
Optimizing an existing system can help avoid the need to scale up to more nodes.
Languages built around Erlang are coming up (LFE, Lisp Flavoured Erlang)
Yacc - yet another compiler compiler for LALR grammars
__ENV__ (and __CALLER__) explains why quote behaves the way it does in the current context
Protocols
Tagging with __struct__ is useful for protocol dispatch: we want an operation for a specific type (see the sketch below)
__ENV__.__struct__
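A minimal sketch of struct-based protocol dispatch (the Describe protocol and Queen struct are made up for illustration):

```elixir
defprotocol Describe do
  def describe(term)
end

defmodule Queen do
  defstruct name: "beatrix"
end

# dispatch picks this implementation because %Queen{}.__struct__ == Queen
defimpl Describe, for: Queen do
  def describe(%Queen{name: name}), do: "the queen bee #{name}"
end

Describe.describe(%Queen{})   # => "the queen bee beatrix"
```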
merl: sharing immutable constants via the code server
To dump Erlang Core:
erlc +to_core fizzbuzz.erl
or, from the Erlang shell,
c(fizzbuzz, [to_core])
This loses all the sugar; this syntax is for serialization, not for reading.
Optimizations that are done...that you can do?
* inlining
* constant folding
* tuple operations
* lists module calls -- remote calls to the Erlang lists module
erlc +to_kernel fizzbuzz.erl
* you will see remote and local calls
* erlang will make remote calls to the "code server"
* local calls will be in your own module
BEAM (the Erlang virtual machine) is inspired by Prolog's WAM (Warren Abstract Machine)
loading BEAM code takes some time because it's a complicated process
* Instruction Fusing
* Direct threading
The erts_debug module, which is undocumented, is the best tool to understand Erlang processes.
:erts_debug.df(:fizzbuzz)
gives us a fizzbuzz.dis file with the address -> function mapping
Erlang documentation has a great efficiency guide
Erlang virtual machine is written in ... C
Writing idiomatic Erlang makes for more efficient Erlang, because the language and the VM grew together, so the VM supports well-written Erlang.
Micro patterns in Elixir
Presenter made a point that learning in small projects is better than trying to take on a large scale project in a new
language
https://projecteuler.net
http://adventofcode.com
"Micropatterns"
Small practices. Immutability: we understand what it is, but you need time to start really using it instead of fighting it.
#1 Pipelines
#2 Handle happy cases with pattern matching
#3 RECURSION. Back in the day people used recursion only in job interviews.
Now in Elixir... you would actually do it. You must do it! (See the sketch after this list.)
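A minimal sketch combining the three micropatterns; the Micro module and its sum_odds function are invented for illustration:

```elixir
defmodule Micro do
  # #3 recursion + #2 happy-case pattern matching: the empty list is the
  # termination condition, and each recursive call gets closer to it
  def sum_odds([]), do: 0
  def sum_odds([head | tail]) when rem(head, 2) == 1, do: head + sum_odds(tail)
  def sum_odds([_even | tail]), do: sum_odds(tail)
end

# #1 pipelines: each step feeds the next
1..10
|> Enum.to_list()
|> Micro.sum_odds()   # => 25
```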
As a software developer I don't want to rely on insight when going through my day-to-day. I want to methodically apply what I know to solve a given problem.
Scott Wlaschin : Functional design patterns
in OOP we have objects in the large and methods in the small
in FP we have functions in the large and functions in the small
Ex: adding an error property to your struct is easy, and that's essentially how Phoenix plugs work (see the sketch below).
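A rough sketch of that idea in the spirit of a plug pipeline (not the actual Plug API; the Request struct and step functions are made up):

```elixir
defmodule Request do
  defstruct params: %{}, error: nil, response: nil
end

defmodule Steps do
  # each step takes the struct and returns it, possibly setting the error field
  def authenticate(%Request{params: %{"token" => _}} = req), do: req
  def authenticate(%Request{} = req), do: %{req | error: :unauthenticated}

  # once an error is set, later steps pass the struct through untouched
  def render(%Request{error: nil} = req), do: %{req | response: "ok"}
  def render(%Request{} = req), do: req
end

%Request{params: %{"token" => "abc"}}
|> Steps.authenticate()
|> Steps.render()
# => %Request{error: nil, params: %{"token" => "abc"}, response: "ok"}
```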
Trees in Elixir
%Bee{boss: nil, bee_kids: []}
# so now you can catch the big bee
def add_bee(nil, %Bee{boss: nil} = boss_bee), do: boss_bee
# and catch a kid bee trying to pass itself off as if it doesn't have a boss
def add_bee(nil, _not_boss), do: nil
# and catch a kid bee that has a boss and is a boss
def add_bee(%Bee{id: b_id} = b, %{boss: b} ... etc.
You must solve in a way that gets you closer to the termination condition
If you don't get recursion at first, spend more time on it in small cases.
In Elixir the small is the same as the big. Mix small projects with big ones.
Books and links to learn problems by
projecteuler
"Excercises for Programmers" by Brian Hogan
[The Little Schemer](https://mitpress.mit.edu/books/little-schemer) , Friedman and Felleisen.
# Code generation can be useful for productivity. Code generation, like drinking alcohol, is good in moderation
## Templates:
use templates to capture the commonalities
Phoenix uses this when you `rails g`... I mean `mix phoenix.gen.html`
To write your own task:
* lib/mix/tasks/<mything>.ex
* inside the module add `use Mix.Task`
* add @shortdoc for `mix help`
* @moduledoc for your `mix help <task>`
* add run/1 with your code (sketch below)
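Putting those steps together, a minimal sketch (the task name hello and its message are made up):

```elixir
# lib/mix/tasks/hello.ex
defmodule Mix.Tasks.Hello do
  use Mix.Task

  @shortdoc "Says hello"                                   # one-liner shown by `mix help`
  @moduledoc "Prints a greeting. Run with `mix hello`."    # shown by `mix help hello`

  def run(_args) do
    Mix.shell().info("Hello from a custom task")
  end
end
```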
Code Generation
##1 Templates
An example of code generating tasks
Gen.stats.ex, so each of your Elixir projects can have this task and you would have project-specific stats from the services you use, like github.com
1. Create your code first
2. Use an .eex file as a template that has "fillable" parts
3. Then use this template to generate code that is particular to each of the services you use.
Use EEx.eval_file("/filepath") and fill in your details.
When generating, print out what you generated.
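A rough sketch of that flow using an inline template (the module name, URL, and output path are made up; a real generator would keep the template in an .eex file and load it with EEx.eval_file/2):

```elixir
# a template with "fillable" parts; normally this would live in an .eex file
template = """
defmodule <%= module %> do
  def stats_url, do: "<%= url %>"
end
"""

generated = EEx.eval_string(template, module: "GithubStats", url: "https://api.github.com")
File.write!("lib/github_stats.ex", generated)
IO.puts("* creating lib/github_stats.ex")   # print out what you generated
```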
Generators in Phoenix are for education only :); write your own to do what you wish they did.
##2 Macros :: code transformations at compile time, the AST is transformed
`use` is the same as requiring a module and then calling its `__using__` macro. That's what Phoenix does all over the place when including functions.
Use
quote do
  # hypothetical function name wrapping an unquoted value
  def do_it() do
    unquote(thing)
  end
end
An Elixir expression can be quoted to show the data structure that the code becomes, ex:
quote do
  2 + 3
end
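which returns the underlying AST tuple (the exact metadata varies by Elixir version):
`{:+, [context: Elixir, import: Kernel], [2, 3]}`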
##3 Dynamic code compilation
Vision of AI in Lisp where code can compile code given data.
When generating code you need to prepend your module with "Elixir."
Elixir does this for you behind the scenes when generating Erlang code.
module_name = :"Elixir.Adder"
quoted_code =
  quote do
    defmodule unquote(module_name) do
      def adder(x, y) do
        x + y
      end
    end
  end
You quote your code and then you can compile it like this `Code.compile_quoted(quoted_code)` and then you can use
`Adder.adder(5,6)`
`Code.compile_string` awesomely adds compiled code from a string.
This is great for a screen sharing code execution scratchpad.
Case Study: VIV
This thing is a voice platform for us: no device tie-in.
Viv dynamically generates code for every command that it processes.
Live demo of listening to an endpoint that gets the data from Alexa and generates code based on the interpreted text.
Random thought: Voice is coming soon. Node.js allowed front-end developers to get into the backend code and feel awesome.
Voice control is going to allow backend developers to totally skip the front end and feel that it's awesome.
Customers are choosing to skip the "design" and layout phase.
WOAAH
Run programs and collect their output.
Combine this with genetic algorithms: generate code, measure its output, and adjust over time.
That code will be crappy yet more efficient, but it will become more convoluted and complex as machines adjust it.
Distributed Architecture patterns
[Designing for Scalability with Erlang/OTP](http://shop.oreilly.com/product/0636920024149.do) "authd" is the book code
Recommended reading: Dynamo paper from Amazon
* Fully Meshed
* Dynamo - break your request space up by hashing (i.e. sharding, in DB land) and distribute it across nodes (see the sketch at the end of this section);
when a node is lost you can recover the data and processes in just that node.
Mentions Riak the database. Riak Core is a distributed framework that gives you the ability to manage, schedule, and rotate your application.
Your network will fail. You need the ability to recover and become eventually consistent.
Goldman Sachs uses Riak Core and the Dynamo pattern for job scheduling that then works with various services. A few million jobs are managed by 7 nodes that manage the connections for the other thousands of messages.
* Service Bus
* Peer to Peer is the most scalable
Twitter only has 33,000 RPS
WhatsApp has only hundreds of machines running Erlang and sends more messages than all of SMS combined
* SD Erlang - Scalable Distributed Erlang;
the largest deployment is 50,000 Erlang nodes as far as the presenter is aware.
* Ad serving companies ~ 3000 nodes
* Designing APIs around this: standardize them, try to minimize the conversations between nodes, and try not to share data between nodes.
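A toy sketch of the Dynamo-style idea of hashing a request key onto a set of nodes (node names and the key are made up; a real system would use consistent hashing with virtual nodes, as Riak Core does):

```elixir
nodes = [:"app@host1", :"app@host2", :"app@host3"]

# hash the request key into one of the node slots
pick_node = fn key ->
  index = :erlang.phash2(key, length(nodes))
  Enum.at(nodes, index)
end

pick_node.("user:42")   # the same key always routes to the same node
```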