Not too many years ago, it was rare to find a network power switch in a remote network equipment cabinet. Prior to the popularity of out-of-band management, many network administrators relied on an expensive service call whenever a network element at a remote site needed to be rebooted. But as the concept of out-of-band management has become more widely accepted, we’re now at a point where it’s pretty rare to find a remote network equipment cabinet that doesn’t include a network power switch.
Most network administrators have a fairly busy work schedule; their days are already filled with firmware updates, security concerns, and responding to user complaints and questions. In a fast-paced work environment like this, the last thing a network administrator needs is a long field trip to deal with an unresponsive network element at some off-site data center or remote network equipment rack. When a server or router at a remote site fails and disrupts network communication, the problem can often be solved by a simple power reboot; but unless the data center or equipment rack includes a remote reboot unit, that simple power reboot often means a long journey to the equipment site, just to flip a power switch off and back on again.
… And isn’t this the way that these things almost always work out?
AT&T has agreed to settle a lawsuit over the company’s DSL speeds — or more specifically, the DSL speeds many users didn’t get. The class action in question was filed last year, and accuses AT&T of “breaching its contracts with and defrauding some of its customers” by imposing speed limits on their DSL tiers well below the advertised speed. As part of the settlement (pdf), AT&T admits no wrongdoing, and will pay a maximum of $100 million — depending on the total number of class action participants. According to the Milwaukee Journal Sentinel, the attorney unsurprisingly gets the biggest reward:

“Under the pending settlement, some customers may get $2.90 for each month the speed of their DSL service was capped below the maximum rate for the plan purchased. For a longtime customer, that could be several hundred dollars. Some people may get a one-time payment of only $2 if they were dissatisfied with their DSL speeds but don’t meet other terms of the settlement, such as the capping criteria. Individual payments will be based on AT&T account records. The Ohio plaintiff who started the lawsuit is getting a $10,000 ‘incentive award.’ Perotti’s law firm will get up to $11 million plus $3.75 million for charities.”
Any AT&T DSL customer who signed up for service after March 31, 1994 is eligible to apply for a cash award, though again — only customers whose DSL line was capped at a speed lower than the advertised speed will see any significant award (perhaps a few hundred bucks). Many AT&T DSL customers can get $2 just for being unhappy with their speed, for apparently any reason.
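To put the quoted settlement terms in perspective, here’s a quick back-of-the-envelope calculation; the decade-long subscription length is just an assumed example, not a figure from the settlement:

```python
# Rough payout arithmetic under the settlement terms quoted above:
# $2.90 for each month a customer's DSL speed was capped.
# The ten-year duration below is an assumed example, not from the settlement.
monthly_payment = 2.90
capped_months = 10 * 12  # an assumed decade of capped service

payout = monthly_payment * capped_months
print(f"${payout:.2f}")  # → $348.00, i.e. "several hundred dollars"
```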
Many users sign up for a faster tier, and AT&T caps them at a slower speed if it finds that the user’s copper loop length is too great or there’s a problem with the quality of the line itself. Since that’s hard to predict, most operators use the murky “up to” qualifier in ads as a loophole for potentially slower service. Should a user sign up for a faster tier and wind up on a slower one, they should be informed of this and obviously should pay the lower price.
As part of the settlement, AT&T has agreed to monitor its customers for 12 months “to ensure they are consistently receiving the Internet speeds they are paying for.” Of course, beyond the physical speed limitations of the hardware and line, this isn’t always under AT&T’s control, making this something of a false promise. There are some odd bits at the end of the Journal Sentinel story, where the author confuses speed caps with monthly consumption caps, and an analyst from Parks Associates informs the paper that he’s “not really convinced it matters that much” if users see their advertised speeds.
Western Telematic, Inc. (WTI) designs and manufactures Remote Console Server products, Switched PDU products and A/B Fallback Units. WTI products are designed to solve common network problems and manufactured to endure.
FCC Unveils Details On Reclassification Plans – Genachowski, agency’s top lawyer highlight ‘third way’
Thursday, May 6th, 2010
I guess this is good news if you’re a broadband consumer, but not-so-good news if you’re a broadband provider …
As noted yesterday, the FCC has decided to take a “third approach” to regulating broadband carriers, partially classifying them as “common carriers” under Title 2 of the Communications Act. The agency spent last night and this morning briefing carriers, Wall Street analysts and politicians on their plans. Under this partial Title 2 classification, ISPs won’t be forced to open their networks to competitors (something that would be required were they fully classified as common carriers).
FCC boss Julius Genachowski issued a statement (pdf) this morning with more detail — as did FCC General Counsel Austin Schlick (we’ve got a .doc copy of his statement here for those interested). According to Genachowski, the goal with this move is “to restore the broadly supported status quo consensus” that existed prior to the FCC’s court loss to Comcast over throttling upstream user traffic. Specifically, the FCC wants to:
•Recognize the transmission component of broadband access service, and only this component, as a telecommunications service;
•Apply only a handful of provisions of Title II (Sections 201, 202, 208, 222, 254, and 255) that, prior to the Comcast decision, were widely believed to be within the Commission’s purview for broadband (the entire Communications Act is here (pdf));
•Simultaneously renounce (that is, forbear from) application of the many sections of the Communications Act that are unnecessary and inappropriate for broadband access service; and
•Put in place up-front forbearance and meaningful boundaries to guard against regulatory overreach.
Again, this move is going to be debated by policy wonks for years — but the gist is that the FCC’s trying to find some kind of middle ground between carriers (who obviously would love an FCC that had no authority of any kind) and consumer advocates (who want a tough regulator on the beat crafting new regulations). This partial reclassification strips all but six of the roughly four dozen Title 2 rules, and won’t involve regulating broadband rates — or requiring that ISPs open their networks to competitors.
There’s still a lot of detail to be hashed out here, and the proposal must be opened to public comment and then receive approval from at least three of the five FCC Commissioners. Genachowski appears to be trying to please everybody — which may or may not work. We’ll post reaction links from impacted parties below.
Comcast has this to say about the decision:

“While we are disappointed with the inclination not to lean in favor of Title I regulation, we are prepared to work constructively with the Commission to determine whether there is a ‘third way’ approach that allows the Commission to take limited but effective measures to preserve an open Internet and implement critical features of the National Broadband Plan, but does not cast the kind of regulatory cloud that would chill investment and innovation by ISPs.”
Consumer advocacy firm Public Knowledge issued this statement:

“We are generally very pleased with the FCC’s statement this morning. We have said for months that the right path for the Commission to take would be to examine all the possibilities for the best way to protect consumers and guarantee the expansion of broadband. The method the FCC is expected to propose should be on the table, and we are glad it is. Having said that, we were not pleased to read that the Commission at the outset is foreclosing the possibility of requiring line sharing. As the Berkman report found, line sharing is a crucial method to ensuring the long-term vibrancy of the broadband market and to providing more choices for consumers.”
Free Press says they’re “encouraged” by the plan:

“By putting the FCC’s regulatory framework back in harmony with congressional intent, Chairman Genachowski is reversing one of the worst deregulatory mistakes of the past decade. This is a step in the right direction that rejects the special interests of giant network owners. But he should be cautious about throwing out rules that would promote competition and affordability. The Chairman’s plan appears to preemptively abandon important provisions of the law that serve consumers.”
More security issues with Facebook …
Private chat messages and pending friend requests visible to other Facebook users
Facebook shut down its chat system yesterday after it emerged private conversations were visible to other users.
Any user was able to view the live chats of their friends, as well as their pending friend requests, until the social media site was alerted to the problem and took Facebook chat offline.
The glitch, reported by technology blog TechCrunch, meant that people clicking on the “Preview my profile” button – which enables users to see how their information is shown to certain friends – were given information from their friends’ accounts.
The world’s biggest social media network has been under scrutiny over fears its new system, which opens up users’ activity to other sites, makes it too difficult to keep information private.
“For a limited period of time, a bug permitted some users’ chat messages and pending friend requests to be made visible to their friends by manipulating the ‘preview my profile’ feature of Facebook privacy settings,” a company spokeswoman said.
“When we received reports of the problem, our engineers promptly diagnosed it and temporarily disabled the chat function. We also pushed out a fix to take care of the visible friend requests which is now complete. Chat will be turned back on across the site shortly.
“We worked quickly to resolve this matter, ensuring that once the bug was reported to us, a solution was quickly found and implemented.”
The company issued a statement on the Facebook fan page explaining the absence of the chat feature to users.
“Chat is unavailable as we work quickly to fix a bug reported to us,” it said. “It should return to normal soon … We apologise for the inconvenience.”
This morning nearly 5,000 people had said they “liked” Facebook’s comment.

guardian.co.uk © Guardian News & Media Limited 2010
Here’s another interesting battle between technologies …
We should be enjoying a flood of new ARM-based mobile internet devices, but we’re not. Apparently they’ve been stalled while waiting for Adobe to release Flash Player 10 and AIR…
Like other people briefed by ARM, I really expected one or two dozen ARM-powered netbook-style systems or media tablets to appear at the Consumer Electronics Show in January. They didn’t, but why not? Why aren’t there loads already in the shops?
ZDNet’s David Meyer reports that, according to ARM’s marketing vice president, Ian Drew, “events have conspired to stall this plan”. A story headlined “Smartbooks have been delayed by Flash issues, says ARM” quotes Drew as follows:
“I think one reason is to do with software maturity. We’ve seen things like Adobe slip – we’d originally scheduled for something like 2009.”
ARM and Adobe signed a partnership in late 2008 that was intended to see Flash Player 10 and AIR – both rich web platforms – optimised for ARM-based systems. That work is only likely to come to fruition in the second half of this year, when an optimised version of Flash comes out for Android smartphones. As Apple’s Steve Jobs recently pointed out, Flash was originally supposed to ship for smartphones in early 2009.
Shanzai.com’s response is that ARM bashes Flash Unfairly for Lack of Smartbooks, and it points out that the media see “Flash as the new kid on the block to bully”. Yes, the sudden shift of interest from netbook-style devices to tablet formats probably slowed things down. Still, lack of Flash for ARM chips can’t have helped.
Another factor is the failure of Linux on netbooks, because – to ARM’s chagrin – ARM chips cannot run Microsoft Windows. Meyer writes:
“Some of it is also related to there not being many Linux [netbooks] out there either,” Drew added, pointing out that ARM’s architecture cannot support x86-based applications from the PC. “We’ve only got Linux. If you look at forecasts for Linux netbook sales last year, not as many were sold.”
However, this shouldn’t matter much to the nascent tablet market, where Apple’s iPad has demonstrated that you can sell loads of machines without running Windows. And in ARM’s case, as in Apple’s case, Windows shouldn’t matter. The iPad is basically a big iPod Touch (or a big iPhone without the phone part), and an ARM-based tablet is most likely to be a big Google Android phone. In other words, a system written to be driven by the end of your finger, rather than (like Windows) a stylus or a mouse.
This kind of system should start to become widely available fairly soon. Indeed, Zedomax did a hands-on review of a prototype system this week at Web 2.0 Expo in San Francisco. You can get some idea of what it looks like from Zedomax videos on YouTube, with the second one (Android Multi-Touch Tablet Prototype Hands-on Video Review! Part 2) attracting particular attention. Apple fanboys and anti-Flash folks reckon it shows the browser crashing when running Flash, while Zedomax says it doesn’t. On YouTube, he says:
Hey if u dont believe me I will go take another video tomorrow and prove u it didnt crash and show u all the “hidden” button I was using, u can see clearly in the video, there’s a slight moment of pause where i am fiddling with the back button and I accidentally hit the home button, geez, #idontlie folks.
But really, a prototype crashes a browser: who cares? Apparently Steve Jobs reckons that Flash crashes on Macs running OS X all the time.
People who support Google’s Android and Chrome OS ventures – which are both based on Linux – will value the fact that they are open source, so users can do whatever they like with them. Android tablets and other devices provide freedoms that will never be officially available in the iPad’s closed, proprietary system. The freedom to run Adobe Flash could be just one of them.
Western Telematic, Inc. (WTI) designs and manufactures remote management equipment for data centers, laboratories and IT centers. Our comprehensive line of products includes Secure Console Server products, Switched PDU products and A/B Fallback systems.
3D TV sounds like an interesting gimmick, but I’d be more impressed if the TV industry would just focus on producing more watchable programs …
Audience figures likely to lag behind early rush to buy 3D-ready televisions, analysts predict
More than 800,000 households will have bought a 3D-ready television before the end of this year, but fewer than one in eight will actually be watching 3D programmes on it, a new report predicts.
This year has been billed as a “3D summer”, with consumers expected to flock to buy 3D TV sets following the hype surrounding the hit film Avatar and the promise of viewing sport such as football via the new technology from BSkyB. But the report from Informa Telecoms and Media forecasts that while 845,000 households worldwide are likely to have 3D TV-ready sets by the end of the year, just 101,000 homes will be watching 3D shows.
By the end of 2015, 3D-ready sets will have reached 70m households, some 5.1% of all homes that have a television. But once again a huge number – up to 68% or 48m – will not actually be watching 3D TV shows on them. Just over 30%, or 22.2m homes, will be watching 3D TV shows.
The report lists a number of factors contributing to the gap, including a significant number of viewers not realising that they must also have the right set top box and/or subscription package.
A similar issue plagued the surge of interest at the arrival of high-definition TV in time for the 2006 World Cup, with many viewers not realising they needed the right equipment to receive it. Simon Murray, the analyst behind the report, said that this was not likely to be a major issue with 3D TV, as it is instantly obvious when programming is not in 3D.
Other factors include consumers buying the TV sets more for 3D DVDs or 3D gaming than out of an interest in television shows.
The report says that while just 22.2m TV households globally, some 1.6% of all homes with televisions, will be watching 3D programmes by the end of 2015, the market will still be immature at that point, with “significant growth opportunities” to come.
Issues with the cost of 3D TV sets, and with the scarcity and cost of 3D production, should start to ease with the promise of massive events, such as the London 2012 Olympics, set to be a sales showcase for the technology.
By the end of the year the number of homes watching 3D TV programming globally will be 101,000. Of those 60,000 will be in North America with just 22,000 in Western Europe.
By the end of 2015, 6.8m of the 22.2m households worldwide watching shows in 3D will be in Western Europe. The UK is expected to have 1.6m 3D TV households by the end of 2015. North America will have 9.2m homes, with Asia Pacific at 4.6m.
For over 30 years, Western Telematic, Inc. (WTI) has been an innovator in the field of remote management for IT facilities. Our comprehensive product line includes a wide range of Serial Console Server products, Switched PDU products and Remote Reboot Switch products to provide secure, remote management of servers, routers and other devices.
There certainly has been a great deal of news coverage of this story; I guess a good “Browser Battle” is always interesting …
The browser war between Microsoft and Google has intensified. The software and Internet search giants are going head-to-head in attracting and keeping users of their popular Internet browsers.
Microsoft, which has dominated the browser market for years, is quickly losing market share to Google’s Chrome browser. That was made clear when Microsoft’s share of the browser market fell below 60 percent in April, while Chrome’s share increased by nearly 25 percent.
The battle between the two grew stronger on Tuesday when Google shared new features of its browser, Chrome 5 beta for Windows, Mac and Linux. The release of Chrome 5 beta came just six weeks after Microsoft provided developers with a sneak peek at Internet Explorer 9, touting some of its new features.
Improvements in speed also mean users will be able to experience web pages loading at 2,700 frames per second, a faster rate than what competitors Mozilla Firefox, Apple Safari, Opera Software’s Opera and IE offer.
Support, Not Speed
While a speedy browser is a must for Internet users, it is not the differentiating feature.
Consumers and developers are seeking support for the multiple applications that will run on the browsers.
At the core of the Chrome 5 beta are new HTML5 features including Geolocation APIs, App Cache, and drag and drop capabilities, according to the company. Also included for the first time is an integrated Adobe Flash Player plug-in, which enables users to browse secure, rich web sites….
Western Telematic, Inc. (WTI) designs and manufactures remote device management products for IT applications. WTI’s Serial Console Server products, Remote Reboot products, Switched PDU products and A/B Fallback products are engineered to allow you to securely manage and troubleshoot rack equipment in remote locations.
This is certainly something to think about when using social networking sites …
When Facebook announced at its f8 conference last month that it is making changes to its giant social networking site, I didn’t pay close enough attention.
I was aware the company was changing the way its site interacted with others on the Web. I also understood after logging into the site that Facebook was altering the way it displayed some of users’ personal information, including their educational background and interests.
But I overlooked the significance of those changes until last week, when a co-worker made me realize what they meant, and just how cavalier Facebook is with the privacy of its users.
As I’ve mentioned in past columns, I’m a big fan of Pandora Internet radio. Generally, if I’m listening to music on the radio, I’m tuned into one of my stations on Pandora, not a traditional broadcast radio channel. I’ve shared my stations with my wife and a few family members. But I never intended to share them — or my listening habits, including songs I’ve “liked” — with anyone else.
Thanks to Facebook’s changes, I suddenly was sharing them with a lot more people. My colleague showed me that when he went to Pandora and signed into Facebook using a new widget on the music site, he was able to see everything that I’d been listening to lately, including what songs I’ve given a thumbs-up. And it wasn’t just my Pandora activity he had access to. He could check in on the Pandora habits of any one of his friends on Facebook who also had a Pandora account.
After seeing this demonstration, alarm bells went off in my head over the privacy threats that Facebook’s changes pose.
It’s not that I particularly care if my colleague can see what music I’ve been listening to lately. But I have 635 Facebook “friends,” many of them…
Western Telematic, Inc. (WTI) designs and manufactures remote power control and remote port access products for the IT industry. Our Outlet Metered PDU products and Console Port Server products provide valuable tools for any IT manager who needs secure, remote access to power control and command functions on rack mounted IT equipment.
A device that combines console port access with power management capabilities benefits any network. A hybrid console server provides access to command functions as well as the ability to power a connected device on or off, or to reboot it when necessary. A standard console server would normally only let you see that a device was not working, without the ability to physically reboot it, defeating much of the purpose of remote access. The same goes for a standalone power unit: you can power a device on and off and reboot it, but you have no access to its command functions to see what is happening inside. Adding power management to a console server makes it a truly complete remote management unit.
Another benefit of having a power option on your console server is that if network switches connected to the unit happen to go down, it is possible not only to see that they are down, but also to reboot them and watch them restart. Being able to view the reboot through the console port means that you can look for problems or errors in the start-up scripts that could indicate why the device went down in the first place. This helps to maximize uptime and network efficiency, all without having to physically access the devices themselves. When a network switch goes down, you don’t need to spend additional money to send a person out to fix the problem; it can all be done remotely.
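Since the console port captures a device’s boot messages during a remote reboot, even a short script can scan that captured output for trouble before anyone gets sent into the field. The sketch below is purely illustrative: the log lines and error keywords are invented, and real switches and console servers each have their own output formats.

```python
# Hypothetical sketch: scan console output captured during a remote reboot
# for lines that look like start-up errors. The log format and keyword list
# are invented for illustration; real device output will differ.

def find_boot_errors(console_log: str) -> list:
    """Return lines from a captured boot log that look like errors."""
    keywords = ("error", "fail", "panic", "timeout")
    return [
        line.strip()
        for line in console_log.splitlines()
        if any(k in line.lower() for k in keywords)
    ]

# Example captured boot transcript (invented):
captured = """\
Booting switch firmware v2.1...
Initializing ports 1-24... done
ERROR: port 7 SFP not detected
Loading startup-config... done
"""

for issue in find_boot_errors(captured):
    print(issue)  # → ERROR: port 7 SFP not detected
```

In practice, the transcript would come from the console server’s serial session log rather than a hard-coded string, but the filtering idea is the same.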
Another advantage of a console server and power combo unit is that it maximizes rack space by combining two units into one. A single rack unit can provide, for example, eight console ports and four switched power outlets. This means that space can be better utilized and saved for important network equipment.