Hypocrisy in Hollywood
March 3rd, 2012 under Articles, Digital Rights, rengolin. [ Comments: none ]

Paralegal's Peter Kim sent me this nice infographic about the short history of the media industry in Hollywood, and I thought I would share it with you.

I'm not a lawyer, but his site seems to have some good bite-sized information about copyright and other legal terms that we should all know if we are to avoid Big Brother in our society. Most of it obviously only applies to the US, but as we all know, US law has been extended over the world far too much: British hackers being extradited to the US, European citizens being harassed by US media companies, and Asian companies being shut down by the mighty power of Hollywood.

There are other infographics on the site that are worth looking at. Thanks for the tip, Peter.


Eventually everyone wants to be AOL
January 25th, 2012 under Articles, Corporate, Media, Politics, rengolin, Web. [ Comments: none ]

After a good week battling SOPA, it's time to go back to real life and to battling enemies closer to home.

As has been reported over, and over, and over again (at least on this blog), Google is dragging itself towards becoming the giant, dominant player, much like Yahoo! and AOL in times past.

Lifehacker has a very good post on the same subject (from which the title of this post was deliberately taken), about Google+ and the new Search+ (or whatever they're calling it), and how the giant is losing steam and trying to solidify its market, where it'll comfortably lie until the end of its days.

True, Google has a somewhat strong research department and is working towards new TCP/IP standards, but Yahoo! did much the same in the past, with FreeBSD, PHP and MySQL. Yahoo! actually hired top-notch BSD kernel hackers (like Paul Saab), MySQL gurus (like Jimi Cole and Jeremy Zawodny) and the PHP creator, Rasmus Lerdorf, and they put a lot back into the community. But none of that is true revolution, only small reforms to keep themselves in power a bit longer.

The issue is simple: Google doesn't need to innovate as much as it did in the past, just as Yahoo! and AOL didn't. Even Microsoft and Apple need to innovate more than Google, because they have to sell things. Software, hardware and services not only cost money and time, but they age rapidly, and it's all too easy to throw loads of money at a project that is stillborn (like Vista). Google, though, gets its money for free (so to speak); its users don't pay a penny for its services. How hard is it to compete with that model?

Like Google, Yahoo! enjoyed the same comfort in its day. It had more users than anyone else, and that was as good as money. It did get money from ads, like Google, only not as efficiently. And that put it in a comfort zone that is hard not to get used to, which was its ultimate doom. This is why, after 25-odd years of failing, Microsoft is still a strong player. This is why Apple, after being in the shadows for more than 20 years, got to be the biggest tech company in the world. They must innovate at every turn.

Yahoo! displaced AOL and bought pretty much everyone else because it outsmarted the competition by doing the same thing, only cheaper and easier. Google pulled the same stunt on Yahoo! and is now beginning to age. How long will that last? When the next big thing appears, making money even more easily, Google will be a giant. An arrogant, slow and blind giant. And natural selection will take care of it as quickly as it took care of AOL and Yahoo!


The end is near, at least for software patents
January 29th, 2009 under Articles, Digital Rights, Politics, rengolin, World. [ Comments: 1 ]

Ars Technica has a fantastic article on software patents in the US, and how the process is slowly reverting to what it should have been (and was) from the beginning.

They describe the whole history, the important cases, the different points of view, and how the whole thing went nuts this century. The system was bound to fail from the moment big companies started paying billions to patent trolls, but it took a bit too long to actually start reversing…

Could it be Obama's aura? Or do both events mean that the American people have finally started to think for themselves? Whatever it is, it's a step in the right direction, I think.


Who’s afraid of the big bad code?
January 14th, 2009 under Articles, Devel, InfoSec, Politics, rengolin. [ Comments: none ]

What would Bruce Schneier say about the magic list that the NSA is putting together with Microsoft and Symantec of the 25 biggest coding errors that normally lead to security flaws?

Don't get me wrong, putting out a list of bad practices is a fantastic job, that's for sure. It makes programmers more aware of the dangers and, as the article itself says, newbies can learn from others' experience before getting into a new field.

But the way (lay) people take it makes it so magical that the practical value of such a list is greatly reduced.

Order and size of the list

I understand that the order must make some sense, but which? Is it ordered by the number of attacks in the last 12 months? By the sum of all reported losses they caused? By the number of such errors found in common code (in those companies' code, of course)? Or by some other subjective "importance" factor from a bunch of "security experts"?

Also, why 25? Why not 30? Who says the 25th is important enough to show up on the list but the 26th is not?

Real-world

We programmers know most of them, know the problems they pose and, usually, how to fix them. We often want to fix them, but that normally requires some refactoring, and right now it's time to implement those features our client needs for the demo, right? We can think about it later… can we? Will we?
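To make this concrete, here's a minimal sketch of one classic entry on such lists, SQL built by string concatenation, together with the parameterised fix. The example is my own (hypothetical table and queries), not taken from the list itself:

```python
import sqlite3

# Tiny in-memory database, just for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # The classic flaw: user input concatenated straight into the query.
    # Passing "' OR '1'='1" returns every row in the table.
    return conn.execute(
        "SELECT * FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(name):
    # The fix: a parameterised query, so the driver escapes the input.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # leaks all rows
print(find_user_safe("' OR '1'='1"))    # returns nothing
```

Fixing it is trivial here; finding every such spot in a large codebase, under deadline pressure, is the part that never happens.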

Then the NSA decides to make this a priority for the country and claims it as a national security problem. Big companies like fancy terms, and will strive to adopt any new standard that shows up in the market.

Then down comes the VP of engineering, who says:

“We need to make sure every programmer knows how to write code that is free of the top 25 errors.”

Done: he can put up the GIF image from the NSA saying his company's software is secure against all odds, according to the NSA and DHS.

Now, coders and technicians, tell me: Would any editor, IDE or compiler ever be able to spot those errors with 100% accuracy?

“Then we need to make sure every programming team has processes in place to find and fix these problems [in existing code] and has the tools needed to verify their code is as free of these errors,”

Of course not, but they will try: Microsoft will put a beta into Visual C++, other companies will tell their clients that their software is being tested with the new product, and the clients will buy it; after all, who are they to say anything on the matter?
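To see why 100% accuracy is out of reach, consider this tiny example (my own, hypothetical): whether the concatenation below is exploitable depends on who controls the environment at runtime, which no static tool can resolve from the code alone.

```python
import os
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

def name_filter():
    # May hold a harmless default or hostile user input; statically,
    # both cases look exactly the same to an analyser.
    return os.environ.get("NAME_FILTER", "alice")

# Exploitable only if NAME_FILTER is attacker-controlled, which depends
# on how the program is deployed, not on anything visible in the code.
rows = conn.execute(
    "SELECT * FROM users WHERE name = '" + name_filter() + "'").fetchall()
```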

Protect against whom?

Now, after so much time and effort, with 30+ companies and government departments working hard to come up with a (quite good) list of the most common errors that lead to security flaws, what is it all for?

“The real dedicated serial attacker will probably find a way in even if all these errors were removed. But a high school hacker with malicious intent – ankle-biters if you will – would be deterred from breaking in.”

WHAT?!?! All that to stop script kiddies? For heaven's sake, I thought they were serious about it… Well, maybe I expected too much from the NSA… again…

(Note: quotes are from the original article, ipsis litteris.)


Recursive hacking law
January 13th, 2009 under Articles, Digital Rights, InfoSec, Politics, rengolin. [ Comments: none ]

According to the BBC, the new European strategy against cybercrime encourages the police to hack the hackers.

I just wonder whether the European Union has any idea what the word 'hack' really means, how grey the area between white hats and black hats is, and, more importantly, that both types live on both sides of the fence! Ask a hacker to define hacking and you'll need a comfy sofa, and someone else, to actually hear the whole story.

The only problem is that it's recursive. Once the police (or the private sector) hack me, they become hackers themselves, allowing me to hack them, in the interest of security, based on the same law. Right?


When the hunter becomes the hunted
July 22nd, 2008 under Articles, InfoSec, rvincoletto, Sponsored, Technology. [ Comments: none ]

The fast evolution of computer networks has brought fantastic developments in communication and connectivity. We can easily see this evolution in the Internet, first a restricted network and now a complex, global one, where we can do anything from a simple mail exchange to complex and elaborate financial transactions. But this fantastic environment also has a dark side: threats like viruses, worms and Trojan horses, scanning, spoofing, sniffing and snooping, and so many others, have become the nightmare of every organization.

Indeed, technology can play both for and against us.

A good way to make technology work for us is packet inspection. It is a technique frequently used to sniff networks, looking for passwords and breaches, but information security professionals can use it to do exactly the opposite: protect the network.

Packet Analyzer
With a good packet analyzer you can generate information about your integrated information systems, helping the system administrator find and solve problems quickly and efficiently. It's possible to identify attacks, unauthorized access to systems and malicious behavior. In other words, with a good inspection solution your organization will be able to see and analyze everything that hits your network.

You can prevent problems and also reconstruct network sessions, providing the information needed for network forensics. This is when the hunter becomes the hunted: you defend your organization using the same methods malicious actors use to put your business at risk.
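As a rough illustration of what such tools do underneath, here is a minimal sketch of a packet sniffer using only Python's standard socket module. It is a toy, not an inspection product, and it assumes a Linux host and root privileges:

```python
import socket
import struct

# Raw socket that sees every Ethernet frame (Linux-only, needs root).
# 0x0003 is ETH_P_ALL: deliver all protocols to this socket.
sniffer = socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                        socket.ntohs(0x0003))

def describe(frame):
    # Ethernet header: destination MAC, source MAC, EtherType.
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    if ethertype == 0x0800:  # IPv4
        proto = frame[14 + 9]  # protocol number, byte 9 of the IP header
        src_ip = socket.inet_ntoa(frame[26:30])
        dst_ip = socket.inet_ntoa(frame[30:34])
        return "IPv4 %s -> %s (protocol %d)" % (src_ip, dst_ip, proto)
    return "EtherType 0x%04x" % ethertype

# Print a one-line summary of the first ten frames that hit the wire.
for _ in range(10):
    frame, _addr = sniffer.recvfrom(65535)
    print(describe(frame))
```

Real analyzers such as Wireshark or tcpdump do the same capture, then decode hundreds of protocols on top of it.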

Do you want to know more about packet inspection? Watch this video for more information: Deep Packet Inspection explained, or read about it at Wikipedia.


Help us, Obi-Wan Kenobi; you’re our only hope…
February 18th, 2008 under Articles, Computers, OSS, rengolin, Web. [ Comments: none ]

After Yahoo! rejected the MS offer, and all the fuss about the Yahoo! takeover, now Yahoo! itself is breaking apart.

No wonder the shareholders are mad. Yahoo! has been falling to pieces since Google came onto the scene, and now, with a $31-a-share offer while the stock was barely holding itself above $20, the shareholders saw all the return on their investment happening in a very short time, in what might be the last chance they have to see any money back at all.

So here’s a bit of futurology:

David Filo moves to Hawaii; the shareholders sue Jerry Yang and he ends up very poor on his own Caribbean island; Yahoo! is bought by Microsoft for half the price (after the lawsuits there will be little left), and the shareholders will be very happy to at least get some money back.

All the FreeBSD / Apache / PHP will be converted to Windows Server 2003 / .NET / C#, Yahoo! services will be even worse than they used to be, Microsoft will take the users and force them to start using Google services (no one likes to eat crap anyway), and Google will be the last hope of the Internet.

Fortunately, Google is by far more efficient than Microsoft and Yahoo! together (it's not that hard anyway), and it'll be a piece of cake to take them both down while still holding its hat with the other hand. I just hope Google doesn't try to dominate the world as Microsoft has been attempting for decades; they probably know by now that it's like reaching the speed of light: the bigger you are, the more energy you need to pick up speed.

Microsoft and Yahoo! will still exist for a loooong time, and Google will have a bit of competition for a while, at least until the "next-Google(tm)" shows up, puts all three in the sack "with a wave of her hand(tm)", and the cycle starts all over again.

Let’s hope for the best, whatever that is…


Apple is the current Microsoft; who's next? Google?
September 20th, 2007 under Articles, Computers, rengolin. [ Comments: none ]

A friend sent me a link about the new monopoly/patent bastards: Apple Inc.

Apple never worried about open standards and never tried to hide its intention to lock down the Mac market by building a closed architecture-operating-system-applications scheme. In that sense, Microsoft is almost open source. They were among the first supporters, together with IBM, of the open architecture, the PC. In the past it was quite easy to develop programs for DOS (using Borland's magnificent Turbo C++), and it was, in a sense, an open world.

I might even say that Microsoft tried to become the new Apple and failed miserably, for our own sake, because Apple never had much advantage in the market, except among those few posh non-hackers and weird designers. Today Microsoft is being forced to open its server protocols, more and more third-party compilers and IDEs (good free ones) are being added to the list, and so on. It's not a closed world in the strict sense, at least not as closed as the Mac world is.

But Google, the eternal defender of freedom, openness, transparency (?) and good craftsmanship, fighting hard here and there to end the awkward and stupid US patent system, ended up filing its own patent.

What happened? Not enough resources? Or are you playing on their (MS/Apple) own terms? Apple thinks the latter is more probable, and so do I… Google is now in direct competition with Microsoft (desktop search, Google Docs with presentations) and must fight in a field where MS and Apple dictate the rules, and the rules are monopoly and patents, unfortunately…

Well, let's hope the part of Google that wants to break with patents wins before the other part (the one filing patents) does more damage to freedom…

Fingers crossed!


A Time Travel Through Backup History
May 26th, 2007 under Articles, InfoSec, Review, rvincoletto, Sponsored, Technology. [ Comments: 3 ]

When techie guys talk about backup, maybe you don't know exactly what they're talking about. So let me explain a bit about it.

A backup is a copy you make from one device to another for recovery purposes, in case you have problems with your original files. It's an essential procedure for anyone using computers and other digital devices, such as digital cameras and MP3 players. Nowadays, the best-known backup media are CD-ROM, DVD, hard disks and magnetic tape.

All the important operating systems have tools to implement backups, but there are also thousands of powerful software packages out there to back up and restore your data.
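To make the idea concrete, here is a minimal sketch of what any backup tool does at its core: copy files and verify the copies. The paths are made up for the example:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    # Fingerprint of a file's contents, used to verify the copy.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(source: Path, destination: Path) -> None:
    # Copy every file under source into destination, keeping the tree.
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        dst_file = destination / src_file.relative_to(source)
        dst_file.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, dst_file)  # copies data and timestamps
        if sha256(src_file) != sha256(dst_file):
            raise IOError("verification failed for %s" % src_file)

# Example paths, adjust to your own machine.
backup(Path("/home/me/documents"), Path("/mnt/backup/documents"))
```

Real backup tools add incremental copies, compression and catalogues on top of this, but the copy-and-verify loop is the heart of all of them.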

Here's an excellent article explaining all the types of backup, their history, and which type of backup is best for you.

The following graphic shows a backup timeline, covering the most important backup strategies in history. (Click on the picture to enlarge it.)

Backup Timeline

You will learn how punch-card backups are a reference point in backup history, and why they were replaced by magnetic tape and tape backup.

The article also features backup using hard drives and floppy disks, up to our times, when people are using flash drives, Blu-ray and HD DVD to keep their data safe.

The article is a trip through time; you will learn when network backups began to be used, and why online backup is growing so fast.

It's an interesting article for techies and non-techies alike.


 

