Monthly Archives: February 2013

story about pencil marks

I eventually found this book, second hand, obviously.

I first learned about it because Crenshaw, in his compiler book, recommended it as one of the best compiler books ever.

The POINTER TO syntax first appeared in Modula.
Apparently the former owner of the book tried to translate the text from Pascal to Modula, or maybe to Oberon.


Since Modula, there is no BEGIN in WHILE loops.
Modula and Oberon also don't have the FUNCTION keyword; they have only PROCEDUREs (:

und so weiter

quote about debugging

from Wirth’s interview

Well, of course, if the world were ideal I would gladly recommend learning the basic concepts properly and then doing programming with Oberon. But I'm fully aware that the world is not that simple. Programmers nowadays are faced with very difficult tasks and they cannot afford to build systems from scratch. They have to use many tools that already exist and interface their new programs with them, and that's where the problem is. This interface is not even properly and fully specified, and much work comes with that. It is quite well known that if 5% goes to programming, that is much; the rest goes to debugging. And that's peculiar: it's not science, not even engineering, it's just trying.

und so weiter

quote about old design

The libc is certainly not a good guide:

* Buffer overruns are not ruled out by design: gets, sscanf etc.
* Interface inconsistencies: gets vs fgets, fgets vs fscanf (note
the position of the file stream parameter)
* Bad interfaces like that of getchar() whose return code
can be a character or an error code
* Particularly bad buffering system which
– ignores the block structure of underlying file systems, and
– does not support bidirectional buffering
* No provisions exist such that independent libraries can cooperate
with each other in
– signal handling,
– setting up alarms, and
– tracking children.

(Please note that I do not want to bash Ritchie, Kernighan etc. The
libc is history and should be taken as such… It is time to abandon
C and the libc and it does not help to place other systems on top of this historic relic.)

source
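
As an illustration of two of the points above (my own minimal C sketch, not part of the quoted text): fgets takes the stream as its last parameter while fscanf takes it first, and getchar folds characters and the end-of-file/error code into a single int return value.

#include <stdio.h>

int main(void)
{
    char line[128];
    int n;

    /* Interface inconsistency: fgets takes the stream last,
       fscanf takes it first. */
    if (fgets(line, sizeof line, stdin) == NULL)
        return 1;
    if (fscanf(stdin, "%d", &n) != 1)
        return 1;

    /* getchar returns an int that is either a character (as unsigned char)
       or the error/end-of-file code EOF; storing the result in a plain char
       would lose that distinction. */
    int c;
    while ((c = getchar()) != EOF)
        putchar(c);

    return 0;
}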
und so weiter

story about prokhanov’s interview

prokhanov

The Russian edition of the “Yerevan” magazine, in its first issue of 2013, published an interview with the Russian writer Prokhanov about the Eurasian Union.
He states that he cannot imagine Russia as anything other than an empire, and does not conceal that he considers the Eurasian Union a new implementation of this empire. This guy’s cynicism is stunning.
I am quoting:

The Eurasian area with its ethnicities is a colossal resource. The area by itself is already a resource. And combining these areas into one creates colossal wealth.

Undoubtedly, area is a resource. If it weren’t a resource, Azerbaijan would have given up Artsakh a long time ago. Also, when the US spreads democracy, the question arises – for whom? And the same question arises now – wealth for whom? The answer follows.

When the Soviet Union collapsed, Russia lost the uranium mines of Kazakhstan, the cotton of Uzbekistan… It is obvious that there is a huge economic potential in this project.

Okay, Russia wants free uranium and other resources. So the wealth is for Russia.
On the whole, Prokhanov regards both the USSR and the Eurasian Union as different forms of the Russian Empire.

Russian history is the history of empires, and Russian tragedies are always connected with their disappearance. At least four great empires (including the last one, the Red empire) were destroyed, and as a result Russia suffered colossal losses.

Wow, so sincere.
Then maybe empires should not live, by definition? Maybe they have to die naturally with time?
Isn’t it enough to keep making the same mistakes, Messrs Russian imperialists?
Could you understand that it won’t bring the future closer?

Let’s continue. Prokhanov describes the Eurasian Union as it is in reality, not as it is described in advertisements. People are deceived by the “customs union” expression. The real Eurasian Union is a new reincarnation of the Russian Empire, not a customs union of independent states. A customs union could have been created on the basis of the Commonwealth of Independent States. The Eurasian Union, however, is one whole state.

We have already conceived the union state between Russia and Belarus; it has difficulties, but the gene has been found.

Meanwhile Prokhanov explains what kind of state it will be.

The civilizational product is not only the technologies of the future, it is also the ideas of the future. And these ideas and ideologies are already today the most precious possessions; in the Eurasian area they may become an alternative to the dying models, such as the Western liberal model.

Obviously, my comment is that those are not ideas of the future, but of the past. New ideas are voiced in the Occupy movement, for example, but definitely not in Russia.
And although to the next question – doesn’t this mean a reestablishment of the USSR? – he answers no, the USSR cannot be recovered, he also means that this is just another way to create a new Russian Empire: Russian Empire reloaded, the new one, not the old.
And I agree. The Soviet Union at least was Soviet to some extent, and it tried to support some provinces, while working on the assimilation of the native populations and changing the demographic composition (simply put, by settling Russians, the main and base ethnicity of the empire, in the republics). The new union won’t even support the republics; it will have the same cruel capitalist model, and its goal is to soak up resources from those republics. We see widespread pillage in Russia, but it’s not enough for them; they need more, because they are fat and hungry.

And we just “elected” our new president. A president who got this title after not really clear events. A president who thanked Russian leaders for support after those not really clear events. A president who represents the regime that copy-pastes the Russian Federation’s moves, a president who recently signed a paper related to that Eurasian Union. And what now? We have to wait gladly until he puts down another signature, by which the Republic of Armenia will formally acknowledge its failure, and then be painted red or pink on the map, becoming a part of that stupid big state.
photo from here
Why the Russian-language Yerevan magazine prints this interview remains an open question for me. Then again, maybe it’s not surprising that the previous issue had photos of the prime minister with a combine harvester, the president at an exhibition, and so on.

Congratulations, gentlemen. Congratulations. Everything is excellent; let’s enjoy the festive fireworks.

und so weiter

story about the Dell Ubuntu notebook

I advised my parents to buy a Dell laptop with preinstalled Ubuntu. They went to the nearest Dell shop in Yerevan and asked the seller about the laptop I had shown them on the Internet, on a local Dell distributor’s web site (Dell Inspiron N5050).
The seller said:
– I don’t think you want this.
– Why? – asked my father.
– Because it’s Linux.
– And?
The seller probably thought that they had no idea what the word “Linux” means. He started explaining:
– This means it’s hard to use, and you won’t have many of the games and programs you are used to.
– We use Linux at home, – my father tried to set the seller’s mind at rest.
However, I would have asked him:
– Is it official Dell advice not to buy the Ubuntu notebooks produced by Dell?

Okay, so they brought this notebook home, were able to connect to their WiFi, and read web pages.
However, they said they had trouble with games: trigger didn’t run smoothly, and the laptop hung from time to time.
I went to their flat to see what was going on with that notebook.
First of all, it was Ubuntu 10.10. Really old. I don’t know why Dell set up this old version, maybe because it’s an LTS.
Secondly, I cannot understand why the preinstalled Ubuntu was 32-bit on 64-bit hardware. What I can think of is that maybe the Flash plugin was the reason: they decided to use the 32-bit plugin directly, instead of using nspluginwrapper to run the same 32-bit plugin on a 64-bit system, or instead of using the native 64-bit plugin.
Now I need to say that my perception is that Adobe/Macromedia Flash is obviously not written well, because they have been porting it to 64-bit for a long time, and after years they are still unable to accomplish this task. I believe this example characterizes proprietary software: it is very often written in a non-portable and, I would say, ugly way. Another example is Skype: there is no native 64-bit version yet; it just works by using the 32-bit compatibility mode of modern x86_64 CPUs.
And then I cannot resist pointing out that making decisions because of a dependency on such software as Adobe Flash or Skype, proprietary and badly written software, is not a good idea. Why would someone need to run an x86 system on x86_64 hardware? I have been using 64-bit GNU/Linux since 2005, and I have never experienced problems with the hardware. The only problems I can imagine are caused by proprietary software which the vendor cannot port to the new platform.
I didn’t use Flash or Skype. I didn’t use Adobe Reader; we have Evince.
So choosing an x86 OS instead of x86_64 because of Flash means using crutches in the design. Clean design does not need crutches.

However, Ubuntu is an operating system which needs a market and tries to follow customer needs. Dell is a hardware vendor which also works for the market. And the typical consumer wants Flash and wants Skype. That explains why a Dell notebook with preinstalled Ubuntu contains that Skype icon.
The customer wants Skype because customers follow what the marketing guys want them to follow.
And Dell/Ubuntu follows customer needs, which means they follow what the Skype/Adobe marketing experts need.

However, this still does not explain why the Dell guys had to choose that old version of Ubuntu, where the Intel video drivers just don’t work out of the box on the modern hardware they distributed this software with. They sell a notebook with an operating system which does not support its hardware, even after updates of that LTS version.
So hardware video acceleration didn’t work.

Also, running a few programs caused the environment to hang: it was possible to move the mouse pointer, but it was not possible to click or interact with buttons and other elements on the screen; they did not respond.

The ancient Firefox 3, which was also preinstalled, was not able to show any modern web page. I had to manually download and install a modern version of Firefox, knowing how hard it would be to explain to my parents how to upgrade Firefox manually without me. Previously, on Debian, they used the updates suggested by the operating system, but since I installed Firefox without the package manager, they now have to update it by hand too.

I didn’t have an exact explanation of why the environment hangs; my guess was that it might be connected to the old kernel, which means old drivers incompatible with modern hardware. Probably the Intel video drivers are not the only incompatible part of the system.

Another problem I dealt with was that Ubuntu suggested updating the wireless driver. I don’t remember whether I chose the “restricted drivers” application or the suggestion appeared on the screen by itself. However, I was sure that Dell and Ubuntu had already tested the driver suggested by the OS, and that it was safe to use it.

I agreed. My guess was that the system becomes unresponsive because the network hangs, and the network hangs because of WiFi driver issues.
After the wireless driver change, done by that Ubuntu restricted drivers utility, the wireless network disappeared. I checked: there was no wlan interface shown by ifconfig. A short search on the web indicated that the Ubuntu utility may have blacklisted the originally installed module so that it won’t load, and I tried to edit the blacklisted modules file in order to uncomment it.
During the next reboot Ubuntu alerted that there was an important change in the configuration files and suggested reinstalling the system. I agreed, because I was tired, and then the installer erased not only the root but also the user’s home, because the whole system was installed on one partition, without separating /home as a separate entry in the partition table.

How did it happen that Dell approved the sale of a notebook with a system which does not support the hardware well?
I don’t know. Maybe they were not serious about it, and selling Ubuntu just means they believe nobody will use it, and it’s a way of selling the same notebooks cheaper, so that future users can install pirated Windows instead of the default Ubuntu. Or maybe they work without attention to detail at Dell, and their quality assurance just did not notice the problems I described.
Obviously the Dell QA guys were not using Ubuntu on that notebook long enough; it is enough just to use it for half an hour to see what happens.

Anyway, I don’t use Ubuntu. I am a Gentoo user, and that means I like it when the system does not hide things from me behind wrapper scripts or bells and whistles. I believe Ubuntu is not a convenient system, because it assumes that I am stupid and don’t know anything, and it suggests using its own tools to configure anything. This hides the underlying processes from the user, which complicates understanding what is going on when you press this or that button. It also complicates manual changes when they are necessary: either the manual changes are non-obvious and hidden behind Ubuntu-specific wrappers, or the system alerts that config files were changed and asks to perform a suicidal reinstall of itself.
What did I do?
I just ended up downloading and installing the x86_64 version of Debian Wheezy. Graphics worked like a charm. As far as I remember, I had to manually download the firmware for the ethernet card, and there was no free wireless driver yet, so I had to find a Debian tarball and build the proprietary wireless driver by hand against my kernel. That was easy; I did it in 10 minutes.
The only customizations to the default Debian I made for my parents were:
– adding the MATE repo, in order to give them the convenient desktop experience they previously had with GNOME 2,
– and another repo, with the 64-bit version of Google Chrome with the integrated Flash plugin. Unfortunately, my parents still need Flash.

Anyway, it was easy and clean with Debian, it didn’t take much time, and it’s a pity I had spent the evening before fighting the Ubuntu version which came with this laptop, i.e. because of Dell’s carelessness.

I believe Dell could have done it better: install a more modern version of Ubuntu and have better, more careful quality assurance, i.e. at least check all the drivers; then I wouldn’t have needed to change the system or tinker with the default Ubuntu. That would prevent and eliminate most of the user problems.
Instead, they sold the notebook with obvious problems out of the box. It is not surprising if most inexperienced users would just think that Linux is raw, unstable crap and won’t consider using it again.

It is obvious that the notebook out of the box was not usable, and that an inexperienced user would be unable to solve those issues herself.
Does Dell support cover these problems? I don’t know; I was sure that I could solve my problems better than the local (and actually, not so local) Dell support. And I am almost sure that if anyone calls the local Dell support with those Ubuntu problems, caused by Dell’s negligence, they (the support) would just suggest installing pirated Windows. And I would ask them: is that the official answer of Dell – just install pirated software?

And what can we call Dell after that? How can we wonder why Linux on the desktop is still behind other operating systems?
und so weiter

story about linking Pascal and Oberon

I prefer to write programs in Oberon when it does not require too much extra effort. It often does require extra effort, simply because the language is not widely used, and the community is fragmented and small. This explains why we don’t have a huge code base of Oberon sources, and a regular Oberon developer always has challenges a regular Python programmer does not have. An Oberon developer needs to change existing libraries in order to compile them with different compilers, or maybe prepare wrappers to compile those libraries. It is often necessary to prepare bindings to C code, or just translate some code by hand, simply because no one has done it before.

I don’t yet have good skills in programming graphical interfaces in Oberon; however, you can find an example of a GTK program on my GitHub page.

I have been using Lazarus/FreePascal to develop graphical applications, and it has one clear and important advantage for me: the interface does not depend on the backend widget set. For instance, I have recompiled the same program so that it would draw itself using Qt, or GTK, or even the native WinAPI on Windows.

That led me to the idea of combining some Pascal and Oberon code, and calling Oberon handlers from an LCL application. The Oberon compiler I used, ooc, has a C backend, so the story begins.

Let’s write a minimal module with one exported function in Oberon:

MODULE m;

PROCEDURE add*(a, b : INTEGER): INTEGER;
BEGIN
  RETURN a + b;
END add;

END m.

We can even compile it now with oo2c

$ oo2c m.Mod

and get the directories obj and sym.
We may now compile obj/m.c
and get an object file m.o:

gcc -Iobj -I/local/oo2c/lib/oo2c/src -I/local/oo2c/lib/oo2c/obj -c obj/m.c

Now we need a Pascal wrapper to bind it.

Let’s call it mbind.pas:

unit mbind;

{$link m.o} (*oo2c objs to link*)
{$link RT0.o}
{$link Object.o}
{$link Exception.o}
{$link HashCode.o}
{$linklib c}

interface

uses CTypes;

function m__add(a, b : ctypes.cint16) : ctypes.cint16; cdecl; external;

(* /local/oo2c/lib/oo2c/obj/C.oh
   typedef OOC_INT16 C__shortint; *)

implementation

end.

We need to tell the compiler explicitly which object files to link.
m.o is the object file we got by compiling obj/m.c.
The other files are the minimal oo2c RTL which needs to be linked.
We also need to link against libc, because oo2c works by compiling via C.
The function name is m__add because oo2c translates m.add to C as m__add.
Note that the implementation section is empty, and fpc does not issue an error because the function is marked as external.
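
This also explains the ctypes.cint16 in the wrapper: the Oberon INTEGER is mapped to a 16-bit C type here. The declaration that oo2c generates for this procedure in obj/m.h presumably looks roughly like the following sketch (from memory, not the verbatim generated header):

/* Sketch of the oo2c-generated declaration, not verbatim output. */
typedef short OOC_INT16;   /* assumption: the 16-bit integer typedef from the oo2c runtime headers */

extern OOC_INT16 m__add(OOC_INT16 a, OOC_INT16 b);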

Finally, let’s write a main program to call the Oberon function.

program m0;

uses mbind;

var
  i : integer;

begin
  i := m__add(20, 3);
  writeln (i);
end.

Neither fpc nor oo2c needs a makefile.
The FreePascal compiler, like the Oberon one, can resolve module dependencies itself.
For example, I do not need to write a makefile where I first compile mbind.pas to get mbind.o and then link it in the next step.

However, in this case I’d like to write a makefile.

IPATH = -Iobj -I/local/oo2c/lib/oo2c/src -I/local/oo2c/lib/oo2c/obj
LPATH = /local/oo2c/lib/oo2c/obj
GCC = /usr/bin/gcc
OO2C = /local/oo2c/bin/oo2c
FPC = /local/fpc/bin/fpc

all:
	$(OO2C) m.Mod
	$(GCC) $(IPATH) -c obj/m.c
	$(FPC) -Fo$(LPATH) -Fo. m0.pas

clean:
	rm *.o
	rm *.ppu

I usually use a custom fpc I compiled from Subversion, and I prefer to keep my software in /local.
LPATH is the path where the main oo2c RTL objects are located, so fpc can find and link RT0.o, Exception.o, Object.o and HashCode.o.

So, first oo2c compiles the m module to C.
Then gcc compiles it into an object file.
Then fpc compiles the main program and links everything together.

And now enjoy:

$ ./m0
23

und so weiter