srchub.org and Heartbleed

09 April, 2014

If you haven't heard already, Heartbleed is a bug in OpenSSL that could allow an attacker to acquire a server's private certificate keys and decrypt past or future communications. You can view more information here.

srchub.org runs Debian, and the folks at Debian released a patch fixing this issue almost immediately.

The srchub.org server was updated last night, and a new certificate, generated from a new CSR/private key, was installed this evening.

New srchub features

06 April, 2014

There has been a lot of activity at srchub, with several new features.

A new syntax highlighter is in place and is being used for source preview. This was needed because the old source preview highlighted one line at a time, which caused multi-line comments to be rendered incorrectly.

You can now select a different theme if you decide to use the new syntax highlighter for code in your wiki or on your project front page.

Example:

Using the Django theme:

At srchub we believe that any data you upload is your data, and you shouldn't be locked into a single service provider, especially if that provider decides to sail off into the sunset...with your data. A new "beta" feature has been developed that allows you to download your project contents in JSON format.

The JSON file contains:

  • Wiki (and revisions)
  • Files
  • Reviews

It does not contain source code - though that can be downloaded separately in its entirety from hg/git. If you have a Subversion repo and would like an offline copy, you may contact an admin.
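Here is a rough sketch of how you might peek at what an export contains. It assumes the cJSON library and a placeholder file name, and it makes no assumptions about the exact key names in the export beyond what is listed above:

    /* Rough sketch: list the top-level sections of a project export.
     * Assumes the cJSON library (https://github.com/DaveGamble/cJSON);
     * "project-export.json" is just a placeholder file name.
     * Build: gcc inspect_export.c cJSON.c -o inspect_export */
    #include <stdio.h>
    #include <stdlib.h>
    #include "cJSON.h"

    int main(void)
    {
        /* read the whole export into memory */
        FILE *fp = fopen("project-export.json", "rb");
        if (!fp) { perror("project-export.json"); return 1; }
        fseek(fp, 0, SEEK_END);
        long len = ftell(fp);
        rewind(fp);
        char *buf = malloc((size_t)len + 1);
        if (!buf || fread(buf, 1, (size_t)len, fp) != (size_t)len) {
            fprintf(stderr, "read failed\n");
            return 1;
        }
        buf[len] = '\0';
        fclose(fp);

        cJSON *root = cJSON_Parse(buf);
        if (!root) { fprintf(stderr, "not valid JSON\n"); return 1; }

        /* print each top-level key and, for arrays, how many entries it
         * holds (the exact key names may differ from the list above) */
        cJSON *section = NULL;
        cJSON_ArrayForEach(section, root) {
            if (cJSON_IsArray(section))
                printf("%-12s %d entries\n", section->string,
                       cJSON_GetArraySize(section));
            else
                printf("%-12s (object or scalar)\n", section->string);
        }

        cJSON_Delete(root);
        free(buf);
        return 0;
    }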

Finally, we have made RSS feeds more prominent throughout the site so that you can use them in other applications or aggregate them into a single feed to show users what you are up to.

Samsung Galaxy Note 3

06 April, 2014

Recently I purchased a Samsung Galaxy Note 3, and before I did anything with it I acquired root. This wasn't for nefarious reasons; it was mainly because the one-time password generator programs I use do not have a built-in sync option. The only thing I can do is use Titanium Backup to back up the application and its data, then use Titanium Backup to restore the app on the new phone. The operation requires root on both phones. I also use Titanium Backup to freeze the bloatware that comes on the phone, especially applications I would never use, which also keeps them from receiving future updates.

After I got my phone all set up and transferred all my applications, I attempted to update and received this error message:

Naturally I was furious - what kind of greedy company would do something like that? The proper message to show is "Look, it seems like you modified the device; this update may break it. Are you really sure you want to do this?" Nope, instead they use typical DRM tactics, because this message also appears for people who HAVE NOT modified their device. That's right, even "honest" people who have never used root or anything else have gotten this message. It appears that a workaround is to remove your SD card (or unmount it) and try updating again. Yeah, to Samsung, having an SD card means you "modified" your device, and as a result you just voided your warranty. Use the device as it was intended and you void your warranty - brilliant thinking from Samsung.

In any case, there is a way to "fix" this.

This is what my about page looked like:

To fix it, just follow the steps mentioned in this post:

  1. Install Xposed Framework (a.k.a. Xposed Installer)
  2. Tap on "Framework", then tap on "Install/Update"
  3. Install Wanam Xposed
  4. Open Xposed Installer, tap on "Modules" and check Wanam Xposed
  5. Reboot the phone
  6. Open Wanam Xposed, go to "Security Hacks"
  7. Check "Fake System Status" (under System)
  8. Reboot again...
  9. Voilà, you can go back to updating OTA

Now this is what my about page looks like:

Stackoverflow comment reply 2

06 February, 2014

http://programmers.stackexchange.com/questions/176435/why-does-facebook-convert-php-code-to-c/176440?noredirect=1#comment453492_176440

Might that have something to do with why Doom 3 was a terrible game?

Metacritic would have to disagree with you. How about the following snippets from Wikipedia - which has linked sources?

Doom 3 was a critical and commercial success for id Software; by the beginning of 2007, over 3.5 million copies of Doom 3 had been sold, making it the most successful project by id Software to date.

or

Doom 3 received a favorable reception from critics...Much praise was given to the quality of Doom 3's graphics and presentation

DeadMG writes

Also, last I checked, they are an evolution of the same codebase.

Per Wikipedia:

id Tech 4 began as an enhancement to id Tech 3. During development, it was initially just a complete rewrite of the engine's renderer, while still retaining other subsystems, such as file access, and memory management. The decision to switch from C to the C++ programming language necessitated a restructuring and rewrite of the rest of the engine; today, while id Tech 4 contains code from id Tech 3, much of it has been rewritten.

I don't consider an engine that is mostly rewritten to be an evolution of the same codebase.

What everyone considers "go-to" is irrelevant to your needs, and what Facebook don't need to do is spend billions of dollars re-writing existing debugged code that their developers have experience in.

This was in response to my argument that at some point a project should be rewritten from scratch. I still stand by that point, and I wouldn't doubt that a project like Facebook has gone through many iterations.

In fact it has - Facebook was originally written entirely in PHP; then they created an in-house compiler called HipHop for PHP, which compiled PHP applications to C++. That project was deprecated in early 2013.

My point is that for large monolithic projects there should be an iterative process of evaluating whether the project can be made better by introducing new technology or switching languages/frameworks. Of course, this isn't a light decision to be made on a whim, and it shouldn't be made often, but there is something to be said about "turd polishing".

DeadMG writes that "What everyone considers 'go-to' is irrelevant to your needs," and while this is true to an extent, by that logic it would be okay to continue writing applications in VB6. The problem is that no one will want to, or know how to, program in it. You are a lot more likely to hire someone who knows Python or PHP than someone who knows PL/I. And even if you have a person who knows VB6, what are you going to do when he leaves or gets hit by a bus on the way in?

Stackoverflow comment reply

04 January, 2014

Rebuttal to stackoverflow comment

I don't see any meaningful reason for any modern compiler to make more than one pass over the source code

This is quite an interesting claim, considering there are three languages I can think of whose compilers make multiple passes: Java, C# (link and link) and Python (link).

So, the concept of "looking ahead" is obviously present in C++, which is what is often referred to as "multi-pass compilation"

The illusion of "looking ahead" is not the result of multi-pass compilation at the parsing stage; it is just lookahead.

gcc (link) uses a single-pass recursive descent parser for C/C++. A recursive descent parser is a left-to-right, single-pass parsing algorithm (link). The recursive descent algorithm allows you to look more than one token ahead.
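To make that concrete, here is a tiny recursive descent parser of my own (an illustration only - this is not gcc's parser) for simple arithmetic expressions. It makes exactly one left-to-right pass over the input and only peeks at the current character to decide which grammar rule to apply:

    /* Illustration only - not gcc's parser. Recursive descent for:
     *   expr   -> term   { ('+' | '-') term }
     *   term   -> factor { ('*' | '/') factor }
     *   factor -> NUMBER | '(' expr ')'
     * The input is scanned once, left to right; the current character
     * is the "lookahead" that picks the production. */
    #include <stdio.h>
    #include <stdlib.h>

    static const char *p;               /* cursor into the input */

    static void skip_ws(void) { while (*p == ' ') p++; }

    static long expr(void);             /* forward declaration */

    static long factor(void)
    {
        skip_ws();
        if (*p == '(') {                /* lookahead picks the production */
            p++;
            long v = expr();
            skip_ws();
            if (*p == ')') p++;
            return v;
        }
        return strtol(p, (char **)&p, 10);
    }

    static long term(void)
    {
        long v = factor();
        for (skip_ws(); *p == '*' || *p == '/'; skip_ws()) {
            char op = *p++;
            long rhs = factor();
            v = (op == '*') ? v * rhs : v / rhs;
        }
        return v;
    }

    static long expr(void)
    {
        long v = term();
        for (skip_ws(); *p == '+' || *p == '-'; skip_ws()) {
            char op = *p++;
            long rhs = term();
            v = (op == '+') ? v + rhs : v - rhs;
        }
        return v;
    }

    int main(void)
    {
        p = "1 + 2 * (3 + 4)";
        printf("%ld\n", expr());        /* prints 15 */
        return 0;
    }

Note that the parser never rescans earlier input - the "look ahead" here is just inspecting the next unconsumed character (a real compiler may peek a few tokens further) - which is exactly the point about gcc above.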

Claims that C++ is "one pass" from that point of view are patently incorrect.

As stated, gcc utilizes a recursive descent parser, which is single pass.

C language does not require prototypes.

This is true.

However, there is a difference between what is true and what you should do.

According to the Open C book, "Unless your stuck with an old compiler, function declarations should always be prototypes and this book uses the terms interchangeably." It also states, "All identifiers in C need to be declared before they are used. This is true for functions as well as variables. For functions the declaration needs to be before the first call of the function. A full declaration includes the return type and the number and type of the arguments. This is also called the function prototype." (link)

This is perfectly valid C code: http://codepad.org/Jul73yc7

However, what is missing is what the compiler spits out. For both C89 and C99, gcc -Wall sotest.c outputs:

sotest.c: In function ‘main’:
sotest.c:4:3: warning: implicit declaration of function ‘foo2’ [-Wimplicit-function-declaration]
sotest.c:4:7: warning: unused variable ‘i’ [-Wunused-variable]
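The codepad paste itself isn't reproduced here, but a file of roughly this shape (a guess on my part, not the original paste) produces exactly those two warnings:

    int main(void)
    {
      /* foo2() has not been declared yet, so gcc assumes "int foo2()" */
      int i = foo2();   /* both warnings land on this line */
      return 0;
    }

    int foo2(void)
    {
      return 42;
    }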

That is assuming the return type is not a pointer. If the return type is a pointer, then you must use either a function declaration or a prototype - which, according to the Open C book, should be a prototype (although in this example it doesn't matter because the function takes no parameters anyway).

This does not compile: http://codepad.org/1UDVejt7

However, when I add a prototype/function declaration it does compile: http://codepad.org/cFwswnWd

Here is another example of what happens when you lack a prototype: http://codepad.org/lqqpayZa
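For readers who don't want to follow the codepad links, here is a sketch of my own (not the original pastes) showing why the pointer case is different. With the prototype, it compiles cleanly; remove it and gcc implicitly assumes the function returns int, warns about converting that int to a pointer, and then rejects the real definition with a "conflicting types" error:

    #include <stdio.h>

    const char *get_greeting(void);   /* remove this prototype and the
                                         file no longer compiles */

    int main(void)
    {
        const char *msg = get_greeting();
        printf("%s\n", msg);
        return 0;
    }

    const char *get_greeting(void)
    {
        return "hello from a prototyped function";
    }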

Declarations and prototypes are two completely different things.

According to the Open C book: "Older versions of the C language didn't have prototypes, the function declarations only specified the return type and did not list the argument types. Unless your stuck with an old compiler, function declarations should always be prototypes and this book uses the terms interchangeably." (link)
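To illustrate the difference the book is drawing (my own example, not from the book): an old-style declaration only names the function and its return type, while a prototype also pins down the arguments so the compiler can check every call:

    double average();               /* declaration, NOT a prototype - says
                                       nothing about the arguments */
    double average(int a, int b);   /* prototype - fixes the number and
                                       types of the arguments */

    int main(void)
    {
        /* With only the old-style declaration in scope, a bad call such
           as average(2.5) compiles silently and misbehaves at run time;
           with the prototype in scope, gcc rejects it at compile time. */
        return (int)average(4, 6);
    }

    double average(int a, int b)
    {
        return (a + b) / 2.0;
    }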

C++11 usage

15 December, 2013

I've got computers still on CentOS5 at work. Even once RHEL7 comes out it doesn't guarantee that everyone will upgrade any time soon.

Source

It's an interesting paradigm, really. Many C++ developers say you must use C++11 features (that "must" is not a typo). Yet many people still don't have a C++11 compiler, or have a compiler that lacks the particular C++11 features you are using, or have a gcc that takes a different C++11 flag (-std=c++11 vs -std=gnu++11).

So, is someone supposed to risk borking their system by attempting to install a newer version of gcc and potentially ending up in dependency conflicts? Sure, the newest Debian/Ubuntu/whatever has a more recent version of gcc in its repos...but I don't understand why someone should be required to upgrade their base OS just to compile your insignificant program (obviously, if the distro drops support for that version, they have to upgrade). Will the author of the code help someone troubleshoot or upgrade gcc on their system?
