Somehow your earlier posts gave me the impression that you had some familiarity with open source and the whole process of software development - this last post makes me wonder.
There are two approaches to software patches (be it OS or application): the fast & nasty and the slow & thorough. The first is the approach often adopted by neophytes - we have a problem, so we throw a few lines of code at it, give it a quick test, and push it out the door. Quite often those few lines of code fix the first problem and introduce half a dozen others. The second is the reverse - we spend a little more time looking at how those few lines of code will impact the whole, and then test a little more thoroughly.
The first approach is what leads to the wild accusations of insecurity that we have seen - a planned feature (HTTP access to files) was pulled, and someone simply removed the web page links that led to it but left the code that allowed access in place.
The second approach is what leads to threads like this.
If you were the manufacturer, which road would you choose - the one that jeopardized the users' data for a quick fix, or the one that gave you a better shot at releasing a quality product?
What would you as a user choose - a manufacturer who pushed out firmware updates on a weekly basis (which, by the way, suggests either a seriously flawed code base or a product rushed to market before it was ready) and whose product you can't trust with your data, or one who took a little longer and a little more care?