Stand With Ukraine

Friday, November 6, 2015

Wifi connection from a command line (OpenSuse 13.2)

My 10-month-old SSD with Fedora stopped working recently, and I had to put back an old HDD with OpenSuse on it. Since it had been a while since I last used it, I decided to update all the packages with zypper... So it went and upgraded the kernel... But that broke my nvidia drivers and I ended up in the command line. That would be OK if I could use the internet to get the driver and compile it for the upgraded kernel (too lazy to use a USB stick). At this point, I decided the time had come to finally learn how to manage my wireless connection using command line tools. After some poking and googling I found a utility called 'nmcli' (luckily it was already installed). And the command seemed pretty simple:
nmcli d wifi connect [ssid] password '[pass]'
But that, of course, did not work, giving me error messages like:
Error: Connection activation failed: (7) Secrets were required, but not provided.
And for some reason I decided to do
killall wpa_supplicant
And after this the nmcli command above worked and I was online again. And I did not have to redo anything after reboot, which is very nice... I am not sure if 'killall wpa_supplicant' was required for a good reason or just because I had launched it while following other tutorials... I've decided to log it here in case, hopefully, it could save somebody (or even myself) from frustration in the future. Cheers
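For reference, here is the whole sequence as a hedged sketch (the SSID and password are placeholders, and the extra 'nmcli d wifi list' step is my addition for finding the network name; this needs a real wireless adapter, so adapt to your setup):

```shell
# See which networks are in range (optional, but handy for getting the SSID right)
nmcli d wifi list

# A manually started wpa_supplicant (e.g. left over from following other
# tutorials) can conflict with NetworkManager, so stop it first
killall wpa_supplicant

# Connect; replace MyNetwork and the password with your own
nmcli d wifi connect MyNetwork password 'secret'
```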

Friday, October 16, 2015

Fedora 22 Gnome, motivation screenshot

This is a screenshot of my current system at home... Looks very nice. The theme is EvoPop, cursors: breeze_theme, and the icons are also EvoPop... The theme is installed through Fedy and set with the Tweak Tool.

Friday, July 10, 2015

Q4OS - motivation screenshot

I have changed the cooler in my Lenovo laptop (a very old T61p that I have had since 2008), and it really helped the performance. So I decided to stress test it with a virtual system. For a long time I had been contemplating testing Q4OS, which is a minimalist system. So here is the first screenshot from the current VirtualBox installation.
I have only good things to say about it so far. It offers to install the Google Chrome browser right from the start, and "apt-get install" seems to work well.

Monday, March 2, 2015

Linux: execute a command periodically and watch the result

I am aware of cron, and it does a great job when you want it to do something for you weekly, daily, or with whatever frequency. But you have to configure it, stop it, and so on. But what if you just want to quickly watch something start happening, or follow the dynamics of something already happening, and then move on (for example, new files appearing in a folder)? I needed this functionality because a model I use stores the last executed time step in a text file and overwrites it after each time step (overwrites, so 'tail -f fname' does not help). What I was doing was executing 'cat fname' a million times until I found out about 'watch -n 1 cat fname' (http://en.wikipedia.org/wiki/Watch_(Unix)), which runs it for me every second and nicely updates the result. So I decided to write it down here for my (and anyone else's) reference.
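To reproduce the situation, here is a small sketch (the file name progress.txt is a made-up example; a real model would of course take time between steps):

```shell
# Simulate a model that overwrites its progress file on every time step
for step in 1 2 3; do
  echo "current time step: $step" > progress.txt
done

# Because the file is overwritten (not appended to), 'tail -f progress.txt'
# shows nothing useful. Instead, in another terminal, re-run cat every second:
#   watch -n 1 cat progress.txt

cat progress.txt
```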

Friday, February 6, 2015

A python script: example for downloading GlobSnow data (snow extent in this case)

Recently a friend asked me if it is possible to selectively download folders from HTTP servers in Linux. It is probably possible using wget or rsync, but I have never succeeded in making them work exactly the way I needed. So I wrote a small script for the task and passed it to him, hoping this might be his first step towards learning Python. And below are 2 versions of the same script:
  1. The version I have actually given to my friend,
  2. An improved, slightly more scary-looking version, which is a bit closer to the way I think it should be written.

First, below I show a quick and simple way of downloading files, with minimal accounting for possible errors. The next step is to check whether the sizes of existing files match the sizes of the remote files and to replace the bad files if required. Of course, it takes some time to download the data, especially if you need several years. If you work remotely, I would suggest using tmux or screen, so your program keeps running even if the ssh session is closed for some reason. But if those are not installed, you can still get away with using nohup as follows:
nohup python download.py >& log.txt &
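As a rough sketch of the idea (not my original script), the size-checking step described above could look like this; the base URL and file names are placeholders, not the actual GlobSnow server paths:

```python
# Sketch: download files from an HTTP folder and re-download any file whose
# local size differs from the remote Content-Length header.
import os
import urllib.request

BASE_URL = "http://example.com/data/"          # placeholder, not the real server
FILENAMES = ["snow_2003.nc", "snow_2004.nc"]   # hypothetical file names


def needs_download(path, remote_size):
    """True if the local file is missing or its size differs from the remote one."""
    return (not os.path.isfile(path)) or os.path.getsize(path) != remote_size


def download_all(base_url=BASE_URL, filenames=FILENAMES, dest="."):
    for name in filenames:
        url = base_url + name
        path = os.path.join(dest, name)
        with urllib.request.urlopen(url) as resp:
            remote_size = int(resp.headers.get("Content-Length", -1))
            if needs_download(path, remote_size):
                with open(path, "wb") as f:
                    f.write(resp.read())
            else:
                print(name, ": already downloaded, skipping")


# usage (would hit the network):
# download_all()
```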

Cheers, and any comments are welcome.