Showing posts with label ubuntu.

Friday, April 19, 2013

Friday Link List -- Ubuntu 13.04 Backend Updates and Good News for the Web

Real Dev News for Ubuntu 13.04

This is way better than the stuff you'll find on OMGUbuntu!
  1. gvfs updates with MTP Support (you can now connect your Android 4 device and it just works)
  2. LibreOffice 4.0; a huge number of bugfixes went into this release
  3. cups updates that auto-detect shared printers on your network and auto-share them with other devices, including iPads/iPhones etc.
  4. Much of the backend has been ported to Python 3 in hopes of totally removing Python 2.
  5. Lots of work has gone into sandboxing applications from one another (most likely due to the phone release)
  6. A potential new feature (if it makes it in time) is automatic removal of old kernels, which can take up a few gigs of space over the course of six months if not cleaned manually.
  7. A new upstart version allowing userspace jobs (beta) and launching jobs on file/folder changes, hopefully meaning fewer background services and less clunky code.
  8. Fixes for quite a few RAM hogs that persist across sessions.

Web

  • jQuery 2.0 has been released; it drops support for IE 6-8, and it looks like old Android (2.x) is going to get the chop next.
  • Chrome forked WebKit to form Blink; the team says they were able to drop about 2.5 million lines of code right off the bat, and hopefully it'll speed up their iteration time.
  • Firefox has released its Baseline compiler. The tl;dr version: it cuts down on complexity and does a much better job of supporting IonMonkey, being based on the same backend, giving a huge speed boost.
  • asm.js promises a good future for porting C/C++ apps to the browser, with support in Firefox and Chrome support in the works.

Wednesday, February 6, 2013

HOWTO Fix Flash in Chrome + Ubuntu 12.10

After a recent Chrome update, I was left without Flash in this browser, which, by the way, handles Pandora far better than Firefox does on the same platform.
A quick fix is all that is needed:
  1. Type chrome://plugins/ into your URL bar.
  2. At the upper right, press "Details"
  3. Scroll down to the Adobe Flash Player entry. There should be two sub-entries; these are the versions of Flash that Chrome can use.
  4. Disable the entry that looks like this (should be the first one):
    Name:   Shockwave Flash
    Location: /home/user/.config/google-chrome/PepperFlash/11.5.31.138/libpepflashplayer.so
    Type: PPAPI (out-of-process)

    MIME types:
    MIME type Description File extensions
    application/x-shockwave-flash Shockwave Flash .swf
    application/futuresplash Shockwave Flash .spl

  5. Restart the browser.
For some reason the Flash player using the Pepper API (PPAPI) appears not to be working, so you'll just have to stick with the one that was installed through your platform's package manager (a safer bet anyway).

Wednesday, January 30, 2013

HOWTO Debug Crashes in C/C++ Applications on Ubuntu

In this howto we'll cover:
  • Compiling C/C++ code for debugging
  • Allowing debugging
  • Viewing errors
  • Fixing common gdb issues
Your computer is happily humming along and your program is progressing fine when suddenly disaster strikes! "Segmentation fault (core dumped)," your computer yells before the program plunges back into darkness and your friendly shell prompt reappears.

Debugging on Ubuntu when coding by hand is not nearly as nice as popping up an IDE, but when done right it can be much faster.

Our Problematic Code

For demonstration we'll be using this bit of code, which is written to cause a crash.
int main()
{
    int* p = (int*) 0x0000007b; // an invalid address; the cause of many a Windows XP BSOD
    int j;

    for(j = 0; j < 10000000; j++)
    {
        p++;
        *p = j; // writing through the bad pointer triggers the crash
    }

    return 1;
}
In order to get the best debugging results you'll have to compile your code with the -ggdb flag; in this instance: gcc -ggdb killer.c.
If you run this program, it will die nearly right away with the error: Segmentation fault (core dumped).

The -ggdb flag instructs the compiler to include lots of debugging information. However, you won't want this on production executables because it takes up a lot of space: this example program was 9.6K with symbols and 6.2K without!

Enabling the Dump

By default, Ubuntu 12.10 won't write out crash information for programs you make yourself. To fix this you'll need to run the command:
ulimit -c unlimited
This will need to be re-run every time you log back in to your system.
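You can confirm the new limit took effect in the same shell session (a quick sketch; before the change the limit typically reads 0):

```shell
# Raise the core-file size limit for this shell session...
ulimit -c unlimited

# ...then read it back; it should now report "unlimited"
ulimit -c
```

To make the change permanent you would normally add a "core" entry to /etc/security/limits.conf, though the exact entry varies by setup.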

Debugging

Once you have core dumps enabled and your program crashes, a file called core should appear in the directory from which you ran the program. Use the gdb command to see what information it contains about the crash.
gdb a.out core
Where a.out is the name of your program that crashed and created the core file.

Analyzing the Debug Output

GNU gdb (GDB) 7.5-ubuntu
Copyright (C) 2012 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law. Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-linux-gnu".
For bug reporting instructions, please see:
<http://www.gnu.org/software/gdb/bugs/>...
Reading symbols from /home/joseph/Desktop/a.out...done.

[New LWP 13663]

warning: Can't read pathname for load map: Input/output error.
Core was generated by `./a.out'.
Program terminated with signal 11, Segmentation fault.
#0 0x00000000004004ed in main () at breakme.c:9
9 *p = j;
This is the output you'll get from the gdb command; all you want is the last three lines. From the top they tell you:
  1. What happened.
  2. Where the error happened (in main in file breakme.c on line 9)
  3. What the line was.

Common gdb Errors

  • warning: exec file is newer than core file. means the core file you're using wasn't made by the current version of your executable; you'll need to delete it and run your program again.
  • warning: core file may not match specified executable file. means the core file you're running against probably wasn't made by the program you specified, so gdb will probably report wrong results.

Monday, January 9, 2012

UNIX Utilities III: Yes

This isn't a hugely popular program, but it does have its uses: the basic "yes" command simply repeats whatever you pass to it, over and over and over again. This can be handy for things like testing buffering in a new shell you're developing.

Note that the usage is incredibly similar to echo; it simply wraps the same output in an infinite loop:

/**
* A utility that emulates the UNIX command "yes", repeating the input
* until killed.
*
* Used in places like Jurassic Park (the movie) when Nedry leaves a
* trap that outputs the string "You didn't say the magic word!" over
* and over.
*
* Copyright 2011-12-23 Joseph Lewis <joehms22@gmail.com>
*/

#include <iostream>

using namespace std;

int main(int nargs, char* vargs[])
{
    string s;

    // join all of the arguments with spaces, just like echo does
    for(int i = 0; i < nargs - 1; i++)
    {
        if(i != 0)
            s += " ";
        s += vargs[i + 1];
    }
    s += "\n";

    // repeat the line until killed
    while(true)
    {
        cout << s;
    }
}
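Once compiled (or with the stock coreutils yes), piping the output into head is an easy way to grab just a few repetitions, since yes exits when the pipe closes:

```shell
# Take only the first three repetitions of the line
yes "You didn't say the magic word!" | head -n 3
```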

Saturday, January 7, 2012

UNIX Utilities II: True and False

It is occasionally useful to check the exit statuses of commands in bash scripts; it is even more useful to test your scripts for all eventualities before you release them, to make sure strange errors don't begin occurring once they have been run in a variety of environments. This is where "true" and "false" come in.

These two simple programs simply return 0 or 1 as an exit status when run. Exit statuses are probably the only case in a UNIX system where you will see 0 denoting a success and anything else a failure.

Once compiled, these programs end up being about 8.3 kB on my system, which is almost a third the size of the GNU versions; how this happened, I have no idea. They are in essence the simplest programs in the world:

/**
* true - Probably the simplest program in the world, does nothing, and
* succeeds at it. Not very realistic, I know, but this is UNIX after
* all, not the real world where not doing anything makes you a failure.
*
* Copyright 2011-12-23 Joseph Lewis <joehms22@gmail.com>
*/

int main()
{
    return 0;
}

/**
* false
* Probably the (second) simplest program in the world, does nothing,
* and fails at it. Note that this is the second because "true" is the
* first.
*
* Copyright 2011-12-23 Joseph Lewis <joehms22@gmail.com>
*/

int main()
{
    return 1; // FAIL
}
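These exit statuses are exactly what bash conditionals consume; a minimal demonstration using the stock true and false commands:

```shell
# "if" runs its body when the command exits with status 0
if true; then
    echo "true reports success"
fi

# $? holds the exit status of the last command; false sets it to 1
false
echo "false exited with status $?"
```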

Friday, December 2, 2011

Choosing Your Next Webserver

Being an occasional web developer, I like to be able to debug on my local machine. My old friend Apache just wasn't doing it anymore, however: it made my computer boot incredibly slowly, I didn't need the power, and the configuration was giving me a headache.

I went into Synaptic Package Manager and pulled out three others: nginx, which I had heard good things about; Cherokee, which seemed to have a bunch of packages; and nanoweb, a PHP-based server that I had never heard of.

Apache
I used gnome-system-monitor to profile each of the programs as they were running. Apache had 3 threads/child processes running at 7MB and 4 at 5MB. That, along with a hefty and often confusing configuration, is why I dropped it.

The amount of RAM to beat, then, was 41MB.


Nginx (Engine X)
I first tried a server that I had been dying to give a go for quite some time: Nginx. It runs popular sites such as http://dearblankpleaseblank.com and, according to its online site, excels at serving static pages.

The configuration didn't really seem like anything I had dealt with before (kind of like JSON, kind of like INI) and took quite a bit to understand; the online manual wasn't much help, and the examples in the file were minimal.

I wasn't able to get PHP running with it even after following a few tutorials. I'm sure you can, but I decided to move on.

The rest of the directory structure was kind of like Apache's, with the same sites-available folders and such. However, the default site was in /usr/ somewhere rather than /var/www; bizarre.

While it was running it had 5 threads/child processes at about 2MB apiece: 10MB, not bad.


nanoweb
Next up I installed "nanoweb", a simple webserver written in PHP5. After rebooting I found that the default page provided some help on getting things configured, and there was some kind of configuration interface available as a secondary package, although this would also have been available outside of localhost, exposing your server to threats.

I took a look at the configuration files and found three, which was much nicer than Apache. One seemed to be for CGI, which I didn't have a reason to look at, but I was able to see it had support for vhosts (in a very simple INI-style file; it looked like each vhost would have taken something like five lines to set up).

I configured my site through the main file, replacing all of the default "It Works"-style values with my own. A reboot worked flawlessly and I was able to start working on my site right away. The file had lots of documentation and tips for configuration.

Using my profiling tool I saw that nanoweb used 3 threads/subprocesses with an average of 7MB overhead each, undoubtedly due to PHP being a scripted language: 21MB, not as good as nginx, but not as bad as Apache.

Cherokee
I didn't want to give up the nice nanoweb configuration I had, but I had read some good things about Cherokee and wanted to try it.

The install went really fast; unlike nanoweb, which took about 20MB of packages, Cherokee must have taken more like four, probably due to its base requirements being no more than the C standard library.

I found that to configure it you have to run a utility called "cherokee-admin"; once run, it gives you a username and a temporary password and opens a server on port 9090 until you close the admin interface.

Once logged in, the GUI is amazingly nice, giving a nice profile of your machine, easy access to all virtual hosts, and wizards to set up most things. Enabling PHP on my site took four clicks, and it seems that most things are configured for security. Even applying the changes was nearly instantaneous, as the web server can restart itself from the interface.

While running, the whole thing has two threads of 2MB each: 4MB.

Sunday, October 16, 2011

Ubuntu 11.10

Whenever a new OS comes out, there are always improvements, but lots of compromises. After trying out Ubuntu 11.10 I'm, quite frankly, disappointed.

The Good
The upgrade was painless, the boot time is much improved, and LightDM looks beautiful.

The Bad
Grub has failed on me four times in two days; I have to hit the power button and start again to get past the blinking cursor.

In two days of use, my mouse cursor has frozen on screen twice; I can't move it afterwards.

The new software center only allows you to install one piece of software before searching for another.

Unity (still) doesn't allow me to drag and drop shortcuts to my folders to the dock.

The Ugly
Nautilus (if you still are Nautilus beneath your new exterior) doesn't show breadcrumbs unless in full screen.

The menu in the upper right of the screen is excessively large for people like me who never use the user switcher, a desktop email client, or social networks outside our browsers.

You can't uninstall zeitgeist without completely removing the ability to launch applications.

After installing compiz-config-settings-manager to try to resize the dock to be smaller, and to make it show up immediately when I mouse over it (so it doesn't destroy my workflow), I couldn't; compiz also disabled my alt+tab and my Aero snap.

With the newest GNOME desktop build (ubuntu-classic) the desktop looks like crap: lots of vertical lines, and no configuration for how the layout is done (i.e. I want to remove the bottom panel and install docky without going into gconf; that would be fine).

How to fix it
Install xfce or KDE (yes, even I am willing to switch to KDE after

Tuesday, April 5, 2011

Ubuntu Firewall Alerts

Notification of an attempted connection.
At the university I study at, every machine is assigned a net-accessible IP address; as you can imagine, this is immensely useful as you don't need to worry about trying to bypass firewalls when you need to SSH somewhere. There is a catch though: save for blocking torrents (including LiveCDs that take hours to download manually, but minutes by torrent), there is no firewall.

That is okay by me though; I use Ubuntu, meaning I don't get viruses, and the only port I have remotely open is for IPP because the network is Windows. Recently I have wanted to see all of the garbage that is coming in. I originally hacked something up in nc, but decided that wasn't good enough. Here is my solution, which notifies you when someone attempts to connect to a port (a feature missing in all of the Linux firewalls I have found, and it seems to be a common complaint). The finished product will look something like the photo above.

Ingredients:
  • Ubuntu / Distro of your choice.
  • iptables (installed by default in Ubuntu 10.10)
  • gufw (sudo apt-get install gufw)
  • notify-send / espeak / xmessage / zenity / other communication interface

Instructions:
  1. Install all of the above.
  2. Under System > Administration > Firewall Configuration, set Incoming to Reject; and turn on the Firewall.
  3. Copy the shell script below to your machine:
  4. #!/bin/bash

    # remember the most recent UFW BLOCK line so only new entries trigger an alert
    lastlog=$(dmesg | grep "UFW BLOCK" | tail -1)

    while true; do
        sleep 1

        curlog=$(dmesg | grep "UFW BLOCK" | tail -1)

        if [ "$curlog" != "$lastlog" ]; then
            # pull the source IP and ports out of the log line
            ip=$(echo "$curlog" | cut -d = -f5 | cut -d ' ' -f1)
            portfrom=$(echo "$curlog" | cut -d = -f13 | cut -d ' ' -f1)
            port=$(echo "$curlog" | cut -d = -f14 | cut -d ' ' -f1)
            lastlog=$curlog

            # send the desktop notification
            notify-send "Src: $ip:$portfrom Dest: $port" -u critical -i security-low
        fi
    done
  5. For this to work, you will need the program notify-send; if it is not installed, you could replace it with espeak (to have your computer announce that you dropped a connection), xmessage, or zenity.
  6. Watch how many times you are attacked. (You might want to consider posting your findings to dshield, or looking them up there.)
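The cut field numbers in the script depend on the layout of the kernel's log line, so it's worth sanity-checking them against a sample UFW BLOCK entry (the line below is a typical example with made-up addresses; real lines can vary slightly between kernel versions):

```shell
# A representative UFW BLOCK log line (addresses are documentation examples)
curlog='[UFW BLOCK] IN=eth0 OUT= MAC=00:11:22:33:44:55:66:77:88:99:aa:bb:cc:dd SRC=203.0.113.7 DST=192.168.1.2 LEN=40 TOS=0x00 PREC=0x00 TTL=54 ID=0 PROTO=TCP SPT=55555 DPT=22 WINDOW=0'

# The same extractions the script performs
ip=$(echo "$curlog" | cut -d = -f5 | cut -d ' ' -f1)
portfrom=$(echo "$curlog" | cut -d = -f13 | cut -d ' ' -f1)
port=$(echo "$curlog" | cut -d = -f14 | cut -d ' ' -f1)

echo "Src: $ip:$portfrom Dest: $port"   # prints: Src: 203.0.113.7:55555 Dest: 22
```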