
hzhou / 170226.md
Created February 26, 2017 21:17
Why does C have undefined behavior?

Imagine you are Pinocchio and your main goal is to go to school every day. If we restrict the world to consist only of home, the school, and the road in between, Pinocchio is never going to wander off. But on the other hand, Pinocchio can't learn or do much either, and he will never have the adventures he had or learn what it really means to be a good boy. Of course we want Pinocchio to grow as much as possible. Here we have two approaches. One is to carefully design and expand Pinocchio's world -- adding more roads and buildings, adding more characters. We may even add some tricky but well-controlled scenarios so Pinocchio can expand his learning and experience at a steady pace. Or, as a different approach, we could simply give Pinocchio a few basics -- the ability to move around, the ability to communicate, and above all, the ability to reason -- then expose Pinocchio to the world without boundary. Of course we would expect Pinocchio to mostly go to school, and mostly interact with good people,
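To tie the analogy back to C: the standard hands the programmer a few basics -- memory, arithmetic, pointers -- and draws no fences. Step off the defined roads and the behavior is undefined. A minimal sketch, using signed overflow as the classic example (the snippet is hypothetical, but any such overflow qualifies):

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    int x = INT_MAX;
    /* Signed integer overflow is undefined behavior in C: the compiler
       is free to assume it never happens, so this comparison may be
       folded to 1, the addition may wrap to a negative value, or
       anything else may happen. */
    printf("%d\n", x + 1 > x);
    return 0;
}
```

The compiler, like the world, assumes you stay on the road; it owes you nothing once you leave it.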

hzhou / 170210.md
Created February 10, 2017 17:37
Dilemma of Types

The type system is supposed to help us, but like a sword it is double-edged.

Humans work very well in context. We are very good at switching in and out of contexts. Within a context, we only keep track of a few items; that is how we can be so efficient. However, as we slide in and out of contexts, we often forget some details; that is why we are often encumbered with inconsistency. Computers, on the other hand, are best without context. A computer never forgets, so it is always consistent. This creates a mismatch -- our understanding differs from the computer's. That is the origin of bugs.

There are two ways to go about this. We can teach computers to learn our inconsistency -- e.g. fuzzy logic, neural networks, etc. Or, if we decide that consistency is more important (than efficiency), we can try to be consistent ourselves. The type system is essentially one such system for maintaining consistency.
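For a concrete sketch of the kind of consistency a type system maintains, consider distances in meters and in feet: in our heads they are both "just numbers", and sliding between the two contexts is exactly where we forget. The types below are illustrative names, not from any library; they make the compiler remember for us:

```c
#include <stdio.h>

/* Distinct wrapper types: to us both are doubles, but the compiler
   refuses to confuse one context with the other. */
typedef struct { double value; } Meters;
typedef struct { double value; } Feet;

static Meters feet_to_meters(Feet f) {
    return (Meters){ f.value * 0.3048 };
}

static void report_meters(Meters m) {
    printf("%.2f m\n", m.value);
}

int main(void) {
    Feet altitude = { 10000.0 };
    /* report_meters(altitude);  <- compile error: incompatible type */
    report_meters(feet_to_meters(altitude)); /* consistency enforced */
    return 0;
}
```

The computation is the same either way; the type system only refuses the version where our context slipped.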

Then we realize that when we are in context, the assumed background assumptions are o

hzhou / 170206.md
Last active February 6, 2017 19:14
J

J seems ideal as a quick calculator pad. For that, we mostly rely on its base primitives and standard packages, and it is not much trouble to memorize and internalize the idioms.

If we are going to accumulate our own set of libraries or develop complex systems, we are developing vocabularies, and the library/code we develop is like the definitions of that vocabulary. In this case, I am not sure that succinctness bordering on the cryptic is a virtue. When we use a vocabulary, we don't really need to carry the definitions around. And when we do need to check a definition, don't we prefer a simpler, more verbose description to a cryptic one?

Since its main attraction is its expressiveness, I don't understand why they don't invest in better visualization. It already defaults to an IDE-type editor, so why not make the symbols and layout look more like math?

hzhou / 170110.md
Created January 10, 2017 13:56
Programming is like organizing

I find it a good analogy that programming is like organizing one's office. There is really no clear-cut solution. While everyone likes a neat desk at first glance, I know many people who keep a messy desk and claim it is the most efficient arrangement for them. The modern-day programming craze is quite like advocating organizing with boxes -- boxes inside boxes, and boxes with wires running in and out all over the place. At first glance -- looking at the main entrance -- it is very neat. It often looks nice at second glance too, as we see how the major boxes are neatly labeled. But we discover its other side on the first day we actually get to work.

While I am not advocating messy desks, I am observing that organizing for neat appearances, or by any heuristic rule, does not really work.

hzhou / 170107-2.md
Created January 7, 2017 19:40
Programming is all about understanding

We do not write code randomly. When we program, we always have a mental understanding of what the code is supposed to do. We write code to reflect our intentions, and our understanding of our program is exactly our understanding of those intentions.

Bugs are the reality that our programs do not always match our understanding.

Bugs are almost unavoidable: as a program becomes more complex, it becomes more difficult to comprehend.
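A minimal illustration of the mismatch (the function and its bug are hypothetical, but representative): we understand the code as summing 1 through n, while the code itself says something slightly different.

```c
#include <stdio.h>

/* Intention: sum the integers 1 through n.
   The code reflects a slightly different intention than the one in
   our head -- the mismatch is the bug. */
static int sum_to(int n) {
    int sum = 0;
    for (int i = 1; i < n; i++)  /* bug: should be i <= n */
        sum += i;
    return sum;
}

int main(void) {
    printf("%d\n", sum_to(10)); /* prints 45; we understood 55 */
    return 0;
}
```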

There are two solutions to this understanding problem. One is to delegate: we use third-party libraries and trust them to be bug-free. This creates two additional problems. First, it is impossible to really understand a library that is hidden behind an abstraction layer. Second, the trust is never absolute -- the libraries always contain bugs themselves.

hzhou / 170107.md
Last active January 7, 2017 19:24
Imperative vs Declarative

How vs. What

Programming is traditionally imperative. However, there is always a trend to break away from tradition, and in programming it is fashionable to practice so-called declarative programming.

In layman's terms, imperative programming focuses on the "how" part of the problem, while declarative programming focuses on the "what" part. That is, the imperative approach tries to sketch out the path to the solution, while the declarative approach tries to transform the problem into a form that is already solved, or to decompose it into components that are already solved.
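To make the layman's distinction concrete, here is a minimal C sketch; the only assumption beyond the language itself is the standard library's qsort, and the helper names are illustrative:

```c
#include <stdio.h>
#include <stdlib.h>

/* Imperative: spell out *how* to sort -- every swap on the path. */
static void sort_how(int *a, int n) {
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (a[j] < a[i]) { int t = a[i]; a[i] = a[j]; a[j] = t; }
}

/* Declarative-leaning: state *what* order means, then hand the problem
   to a component that is already solved (the library's qsort). */
static int cmp_int(const void *p, const void *q) {
    int x = *(const int *)p, y = *(const int *)q;
    return (x > y) - (x < y);
}

int main(void) {
    int a[] = { 3, 1, 2 }, b[] = { 3, 1, 2 };
    sort_how(a, 3);
    qsort(b, 3, sizeof b[0], cmp_int);
    printf("%d %d %d / %d %d %d\n", a[0], a[1], a[2], b[0], b[1], b[2]);
    return 0;
}
```

Both produce the same sorted array; one spells out every swap, the other declares what "sorted" means and leaves the how to a solved component.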

This differentiation is stupid.

Solving any problem requires two stages. The first is the analysis stage, during which we try to understand the problem and sketch out and understand the solution in an analytical way. This stage can take place in one's head or on paper, or one may lay it out in a certain framework, e.g. flow charts, classes and interfaces, conditions and constraints. The second stage is

hzhou / 160413.md
Last active May 29, 2016 17:46
Indentation: Tab or Spaces

Code Indentation: Tabs or Spaces?

When this question is raised, we are expected to choose either tabs or spaces; either answer has merit, and in the end neither can be decisively chosen. As such, a general-purpose editor often has options to allow both. However, in practice it often implicitly offers a third option -- a mixture of tabs and spaces. For example, vim explicitly lists this as a feature -- it will intelligently replace spaces with tabs as we indent.

The purpose of this note is to make it clear that although either choice -- tabs or spaces as indentation -- has merit, the mixture of tabs and spaces definitely has no merit at all!
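A short C sketch of why, assuming only the usual convention that a tab advances the display to the next multiple of the tab width (display_col is an illustrative helper, not a library function): the same mixed byte sequence lands on different columns for different readers, so mixed indentation cannot look right for everyone.

```c
#include <stdio.h>

/* Display column reached after printing `prefix`, where a tab jumps
   to the next multiple of `tabstop` and any other byte advances one. */
static int display_col(const char *prefix, int tabstop) {
    int col = 0;
    for (; *prefix; prefix++) {
        if (*prefix == '\t')
            col += tabstop - col % tabstop; /* jump to next tab stop */
        else
            col += 1;
    }
    return col;
}

int main(void) {
    const char *mixed = "\t    ";  /* one tab then four spaces */
    printf("tabstop=4: column %d\n", display_col(mixed, 4)); /* 8  */
    printf("tabstop=8: column %d\n", display_col(mixed, 8)); /* 12 */
    return 0;
}
```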

So when we come to editor tab settings, we should really face only two types of options: one vs. the other, instead of one and the other (and more). The current situation in vim looks like (indent in 4-column steps, with tabs always expanded to spaces):

:set tabstop=4 softtabstop=4 shiftwidth=4 expandtab 

If we observe how we humans communicate, we see a lot of guessing. Human languages are all ambiguous, and their great efficiency depends on (intuitive) guesses -- inexpensive guesses. We make guesses as we listen and read, but we do not commit to our guesses; rather we revise them as we accumulate further information.

Now imagine if we forbade ourselves from revising our guesses -- you'll see what a forbidding task understanding natural language would become ...

Compilation is essentially a guess at meaning with no room for revision -- neither self-revision nor post-hoc revision. So it is doomed to inefficiency or intractable difficulties.

I believe the right way forward for programming is multi-stage compilation -- a meta-compilation first, followed by static compilation. Currently "compilation" refers only to the latter. The meta-compilation produces a human-readable intermediate result -- not much different from our current code in C or Java -- and it will be as easy to get feedback from as we do today in

hzhou / 151218_2.md
Created December 18, 2015 14:45
Ideas behind MyDef vs. e.g. OOP

For people to comprehend, it is not sufficient to convey definitions and interfaces. Rather, we need a feel for it.

In modern practice such as OOP, it is all about definitions and interfaces, deliberately hiding details (abstraction) from readers (including ourselves).

The idea behind MyDef is to make details accessible. A "subcode" is simply authentic code with no extra interface, so we see the code as real as it gets -- what we see is all there is -- no guesswork; we feel all of it. After we have comprehended the code -- know it somewhat intuitively -- we may refer to it simply by a name. That is exactly what MyDef lets you do.
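MyDef's own notation aside, the spirit can be sketched in plain C (the names below are hypothetical, not MyDef's): first we read the authentic code with nothing hidden, and only after it feels intuitive do we let a name stand in for it.

```c
#include <stdio.h>
#include <string.h>

/* The "subcode": authentic code with no extra interface -- what we
   see is all there is. We read it once, feel it, then name it. */
static void strip_newline(char *s) {
    size_t n = strlen(s);
    if (n > 0 && s[n - 1] == '\n')
        s[n - 1] = '\0';
}

int main(void) {
    char line[64];
    if (fgets(line, sizeof line, stdin)) {
        /* After comprehension, the name alone carries the meaning. */
        strip_newline(line);
        printf("[%s]\n", line);
    }
    return 0;
}
```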

hzhou / 151218.md
Created December 18, 2015 14:32
Getting to intuitive

It is not enough to just learn something from the text, even well enough to pass any test. To really learn something, we have to familiarize ourselves with it until it becomes intuitive.

What is intuitive?

Let's use an example, say "go to school". We may learn the definition of it, i.e. getting someone's physical body to a location of learning. And we may learn the interface of it, i.e. start at 7 am, get out of the door, and expect to be at school around 7:15 am. However, try to imagine that you actually never went to school and think about it -- you'll find that although you perfectly understand the definition and even the interface, you can't really feel it, and it therefore remains non-intuitive.

What does it mean to "feel" it? Think about an object that you do have a feel for, e.g. your coffee mug. You know your coffee mug not by a definition nor an interface; rather you possess a rich store of sensory experience that you can refer to with closed eyes to derive information. So to be intuitive, we don