Wednesday, September 23, 2009

Google says the “keywords” meta tag doesn’t affect its search rankings



Web developers will be “delighted” to learn that all the time they’ve spent optimizing the keywords meta tag hasn’t affected the ranking of their pages on the Google search engine in the slightest.
In a surprising revelation slipped into a blog post, the search giant has finally come clean on a matter that has been tormenting developers for years, confirming that its search engine does not use the “keywords” meta tag when ranking web search results. Only the Google Search Appliance sources the keywords meta tag (among others), and only when you explicitly match search results against the supported meta tags.
The consumer Google search engine, however, completely disregards any information embedded in HTML pages via the keywords meta tag. The search giant illustrated this with the following example:
Suppose you have two website owners, Alice and Bob. Alice runs a company called AliceCo and Bob runs BobCo. One day while looking at Bob’s site, Alice notices that Bob has copied some of the words that she uses in her “keywords” meta tag. Even more interesting, Bob has added the words “AliceCo” to his “keywords” meta tag. Should Alice be concerned? At least for Google’s web search results currently (September 2009), the answer is no. Google doesn’t use the “keywords” meta tag in our web search ranking.
Image: Google search results showing a snippet sourced from the “description” meta tag. Google has confirmed that it does not use the “keywords” meta tag in search ranking, but it may source search result snippets from the “description” meta tag.
Google wrote that it stopped using the keywords meta tag “many years ago” because it was too often “abused” with irrelevant keywords. The company did stress that it makes use of other meta tags, such as the “description” meta tag, which may be sourced for search result snippets. To clear up any remaining ambiguity, Google software engineer Matt Cutts said the following:
If you’re looking at the keyword meta tags, we really don’t use that at all. So don’t bother to get frustrated if somebody else is using your name in the keyword meta tags - it’s really not worth suing someone over this. At least for Google, we don’t use that information in our ranking even the least little bit.
Read more at the official Google Webmaster Central blog
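For readers who want to see what these tags actually look like to software, here is a minimal, hypothetical Python sketch of a crawler-style script that picks out the “description” meta tag as a snippet candidate while ignoring “keywords” entirely. The sample page, the MetaExtractor class and the AliceCo wording are invented for illustration; this is plain standard-library parsing, not Google’s actual pipeline.

# A minimal sketch (not Google's pipeline): read the "description" meta tag
# from a page as a snippet candidate, and deliberately ignore "keywords".
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            if name:
                self.meta[name] = attrs.get("content", "")

page = """
<html><head>
  <meta name="keywords" content="AliceCo, widgets, best widgets ever">
  <meta name="description" content="AliceCo makes widgets for small businesses.">
</head><body>...</body></html>
"""

parser = MetaExtractor()
parser.feed(page)

snippet = parser.meta.get("description", "")   # may be shown under the result title
ignored = parser.meta.get("keywords", "")      # has no effect on ranking, per Google
print("Snippet candidate:", snippet)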

Google explains why they’re not using the “keyword” meta tag

Christian’s Opinion

This news should come as a relief to web developers who have been spending disproportionate amounts of time choosing the right keywords for the keywords meta tag, hoping this would somehow improve their ranking on Google. At least web creators can now be certain that they won’t gain (or lose) anything by embedding meticulously chosen search keywords into their HTML pages.
Of course, this is only true for the Google search engine. Other search engines that use different ranking algorithms could still rely on the keywords meta tag. On top of that, there’s nothing stopping Google from sourcing the keywords meta tag in the future. Because of all this, developers shouldn’t abandon the keywords meta tag altogether, but they should stop focusing so much on optimizing its keywords for Google’s main search engine. It’s a pity Google didn’t clear this up a few years ago.

An updated Google Talk now plays nice with Snow Leopard


Google has updated the Google Talk video and audio chat plugin for Mac, fixing half a dozen Snow Leopard-related issues. The updated software now works as expected on Apple’s latest cat.
This past weekend, Google updated its voice and video chat client for Mac OS X. The new 1.0.15 version addresses a number of compatibility issues with OS X Snow Leopard, such as Safari’s inability to detect the presence of the Google Talk plugin and an issue where the plugin would delay system sleep by 30 seconds. Another issue, where video wouldn’t show up in a video chat, has also been resolved, along with problems with uninstalling and reinstalling the same version and with registration for automatic upgrades.
The software will automatically update itself on Tiger and Leopard systems, but Snow Leopard users who previously installed an older version of the plugin will need to manually install the latest version. The company advised users to restart their systems after the update in order to avoid issues with video not being displayed.
Read more at Google’s Talk About blog.

LG Showcases Technological Leadership in Home Appliances at IFA 2009

LG Electronics (LG), an innovator and leader in home appliances, will unveil its latest home appliance products at IFA 2009, held in Berlin from September 2 to 4. Under the theme of “Upgraded Life by LG Home Appliance Innovations,” the company will demonstrate how it inspires and improves consumers’ lives by providing innovative and sophisticated products. Most of all, IFA 2009 will be the first public showing of its 11kg washing machine, the largest ever in a standard 24’’ cabinet, which offers total efficiency for laundry loads of any size.

“We study consumers’ lifestyles and reflect their needs in our products. More recently we’ve listened to customer complaints about how their lives are impacted by having to wash clothing frequently, causing them to sacrifice a significant amount of time for themselves and family,” said Mr H.S. Paik, President LG Electronics Gulf FZE. “At IFA 2009, we are unveiling a solution to this household chore. Armed with LG’s leading Inverter Direct Drive™ technology, innovative damping and fixed tub system, our new washing machine dramatically upgrades our consumers’ lives, saving time, energy and effort through the largest-ever capacity of 11kg.”

LG’s home appliance highlights at the show include a series of innovative and stylish products comprising washing machines, refrigerators, vacuum cleaners, cooking appliances, built-in appliances and dishwashers. Along with the products, LG will also feature the smart technologies built into the devices, including its Linear Compressor, Inverter Direct Drive™ and dust compression system. IFA 2009 will be a great opportunity to experience how your life can be upgraded by LG’s latest innovations.

LG’s booth is located in Hall 1.1 and will consist of three zones, representing efficiency, convenience, and eco-friendliness. An oversized washing machine gate and drum representing LG’s innovative 11kg washing machine’s key benefits and features will grab the attention of stand visitors. Highlights of LG’s participation at IFA 2009 include:

Don’t Underestimate the Importance of the New Ovi SDK

Whilst at Nokia World this year, I noticed a display table in the Experience Lounge with a couple of TFT monitors and a notice saying that something would be announced the following day, the second day of Nokia World.
On the second day, I went back to the Experience Lounge to take a closer look and find out what this new announcement was. It was the new Ovi SDK, which in my opinion has slipped under many people’s radars this year, with most not realizing the importance of such development tools.
Luckily, our Social Media Group got some priceless one-to-one time with the guys behind the Ovi SDK, who were more than happy to show us exactly what it can do, how to do it, and to demonstrate right in front of us how easy it is to use.
If you’re not a developer yourself but are interested in possibly developing in the future, then read on.
The Ovi SDK is a developer toolkit, worked on for the past year, that exposes JavaScript APIs to Ovi services on mobile. Available right now in beta, it enables you to create contextual apps, known as Ovi apps, for Nokia devices.
You may have heard of Carbide, an earlier development tool which, although very useful, was overly complicated, and it was this complication that put off many up-and-coming or would-be developers. The Ovi SDK simplifies the process, making all the tools needed to develop available in the simplest way, with many templates to use.
The first APIs that are available are for mapping and navigation.
The Ovi Maps player API for the web has been in beta for a few months, and Nokia is now bringing it to mobile: high-level JavaScript APIs for vector-based 3D maps, satellite imagery and a whole lot more. Map data is based on Navteq maps, with best-in-class data and global coverage, and it is both open and free. The Ovi Navigation player API introduces turn-by-turn navigation and routing APIs for pedestrian and car navigation, again for free. The SDK itself runs as a plugin in your Firefox (on PC) or Safari (on Mac) browser.
Now I, as a non-developer, asked during the meeting just how easy it was to use one of these templates, and whether they could show us exactly what process was involved and how long it would take. Within seconds the first template was launched, and in less than a minute a custom-named app on maps was created. It’s so simple that even I am considering giving it a try myself!
The Ovi team calls these Ovi APIs “players” because they are like media players: easy to embed in your website, and now easy to use in your mobile app. With one line of code, your app can enable car navigation or pedestrian navigation to the destination.
The Ovi SDK is based on web technologies. With basic knowledge of HTML, CSS and JavaScript, you can create Ovi apps for mobile and embed the Maps player in your website. Nokia has even gone one step further by including in the SDK a UI library for mobile devices that automatically scales to different device form factors.
Ovi apps are contextual. What does that mean? It means that the apps put the user in the center. Literally. By using positioning APIs and the time of day, an app can surface the information that is relevant to the user “right here, right now”, without the need for even a search query.
The map is great for visually displaying large amounts of data on small screens. With navigation, Ovi apps can bridge the virtual and real worlds by guiding users to a location they have discovered.
The Ovi SDK launched recently as a developer preview (i.e. an early beta) to get your feedback. Go to http://www.forum.nokia.com/ovi, apply for the beta program, join the conversation on the discussion boards and let the guys know what you think!
Sources: the Ovi Blog, Nokiausers and the meeting at Nokia World.

Augmented Reality in a Contact Lens

A new generation of contact lenses built with very small circuits and LEDs promises bionic eyesight.
This article from the IEEE Spectrum web journal describes the prospect of projecting images into the eye from the surface of a contact lens, enabling pupils to overlay the scene they are viewing with notes, images and graphics from an e-learning or teacher source. To turn such a lens into a functional system, control circuits, communication circuits and miniature antennas are integrated into the lens using custom-built optoelectronic components. Those components will eventually include hundreds of LEDs, which will form images in front of the eye, such as words, charts and photographs. Much of the hardware is semitransparent so that wearers can navigate their surroundings without crashing into them or becoming disoriented. In all likelihood, a separate, portable device will relay displayable information to the lens’s control circuit, which will operate the optoelectronics in the lens. These lenses don’t need to be very complex to be useful: even a lens with a single pixel could aid people with impaired hearing or be incorporated as an indicator into computer games.
http://www.spectrum.ieee.org/biomedical/bionics/augmented-reality-in-a-contact-lens

Network and wireless analysis: Grid computing

The definition of grid computing entails the notion of making computing power...

At a glance

  • Grid computing involves aggregating processing capacity, storage and other resources to achieve a task that would be beyond the resources of a given institution.
  • The electricity grid is used by some as a model for grid computing, but processor power 'on tap' is a long way from reality for complex projects.
  • Volunteer computing projects rely on individuals donating spare 'processing cycles' on their PCs to solve problems in areas such as medicine, astrophysics and climatology.
  • Large academic research projects, such as those planned for the Large Hadron Collider (LHC), rely on grid computing to analyse their vast data sets.
  • Issues around security and mutual trust when 'donating' capacity are balanced by advantages of higher resource utilisation and contributing to worthwhile projects.
  • Educational networks could be used to advance volunteer projects or for in-house processing tasks, but security issues may limit the former while the latter may be more effectively achieved through 'cloud' services.

Getting on the grid

The broad definition of grid computing, otherwise known as utility computing, entails the notion of making computing power as available as the national grid - some strategists foresee a time when you will be able to plug a terminal into a 'wall socket' and get all the computing power you need. This view simplifies the current state of computing to 'pure' processor power, analogous to electricity, without reference to all the complexities of differing processor architectures, storage requirements, peripheral interactions and a host of other factors. In many respects cloud computing (see TechNews 11/08) offers these facilities by providing computing power and storage via the internet; the user does not know where those servers are located, but can lease the capacity required.
'Grid computing', in more common use and as discussed in this article, refers to a form of distributed computing whereby users can access spare capacity on other people's resources to deal with tasks that would take far too long on in-house hardware. Provision is enabled by a complex web of co-operative pacts and predefined service level agreements (SLAs) that are a far cry from the 'plug in, use now and get billed after' vision of utility computing. As defined on Wikipedia:
Grid computing (or the use of computational grids) is the combination of computer resources from multiple administrative domains applied to a common task.
This definition indicates one of the key features of current computing grids: heterogeneity. There are many computing platforms and a whole mass of communities, research projects and nascent standards, only some of which will be covered in this article.
Grid computing is most suited to scalable, massively parallel computing tasks. These applications can generally handle out-of-order processing, with algorithms that deal with late or missing results, and rely on 'message passing' protocols to control execution by allocating tasks, sharing progress and transferring completed data to the appropriate point. Such tasks include searching very large data sets, video rendering, climate simulations, genome analysis, processing particle physics data and drug research. Some projects involve 'volunteer computing', where people grant access for applications to run on spare processor capacity while their computer is idle. One of the most widely known examples is the SETI@home project, which searches for signals attributable to intelligent sources among the radio background 'noise' of the universe. Some projects allow the greatest contributors to propose their own tasks to be run on the virtual, networked processor.
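To make the 'work unit' idea concrete, here is a toy Python sketch of the pattern, not BOINC or SETI@home code: independent chunks of data are farmed out to worker processes and the results are collated in whatever order they happen to complete. The process_work_unit function and its dummy 'signal strength' score are invented for the example.

# Toy illustration of independent work units processed out of order.
# Real grids (BOINC, SETI@home) add scheduling, validation and fault
# tolerance on top of this basic idea.
from multiprocessing import Pool

def process_work_unit(unit):
    """Stand-in for a real analysis task, e.g. scanning a slice of radio data."""
    unit_id, samples = unit
    score = sum(s * s for s in samples) / len(samples)  # dummy "signal strength"
    return unit_id, score

if __name__ == "__main__":
    # Each work unit is self-contained, so completion order does not matter.
    work_units = [(i, [float(j % 7) for j in range(i, i + 1000)]) for i in range(100)]

    results = {}
    with Pool(processes=4) as pool:
        for unit_id, score in pool.imap_unordered(process_work_unit, work_units):
            results[unit_id] = score          # collate results as they arrive

    best = max(results, key=results.get)
    print(f"Most interesting work unit: {best} (score {results[best]:.2f})")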
Many large academic research projects also use grid computing, taking advantage of facilities in partner organisations to process data during idle time, perhaps at night or between in-house applications.
Educause has a helpful article, 7 things you should know about Grid Computing, and the Worldwide LHC Computing Grid (WLCG) has published Grid computing in five minutes.

The structure of the grid

The grid is inherently heterogeneous: a loose collection of processors, storage, specialised hardware (for example particle accelerators and electron microscopes) and network infrastructure. For each task, appropriate hardware has to be discovered, processor time booked, network capacity scheduled (especially where large data sets are involved) and the collation of results organised. Although this can be achieved on a peer-to-peer basis (in which no one machine has overall control), it is generally arranged as a client-server structure. 'Middleware' is often utilised to manage the applications and resources required to achieve a particular outcome; examples include the Globus Toolkit and UC Berkeley's BOINC software (both of which are open source).
The complexities of managing grid applications are offset by significant advantages, including:
  • access to resources beyond those available within a given institution
  • optimisation of spare capacity
  • flexibility to scale and reconfigure available resources
  • avoidance of single points of failure in the computing infrastructure used
  • data replication across a number of facilities
  • provision of 'virtual' resources in-house, so that experienced researchers are less tempted to go to institutions elsewhere.
Academic institutions have created partnership groups for sharing resources, notably GridPP (for particle physics tasks in the UK), the EU's EGEE science network and the UK's National Grid Service (NGS), while directories like the GridGuide provide international contacts. The Open Grid Forum (OGF) has been behind a number of substantive projects, especially developing standards for the protocols required to deliver and manage grid applications.

Volunteer computing

Volunteer projects are the simplest structure of grid computing: a server provides an application for users to download and a series of 'work units' to be processed during the processor's idle time; each work unit is independent, so results can be returned in any order. However, the researchers running the application do not know whether the user or client PC will produce accurate, authentic results, so tasks are generally randomly duplicated between users, with results compared to ensure validity. The owner of the client PC has to manage the installation and patching of the client application, while trusting that the application provider is doing the work purported, that no malware is being delivered and that the application will not interfere with the operation of the computer. Networks of PCs in schools and colleges could contribute huge numbers of spare processing cycles to these projects, but management overheads and security concerns often deter system managers from volunteering their resources.
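A rough sketch of that redundancy check, assuming a simple quorum rule: each work unit is issued to several volunteers and a result is only accepted when enough of the returned answers agree. The simulate_volunteer function and the thresholds below are invented for illustration; real middleware such as BOINC uses far more sophisticated validators.

# Sketch of result validation by replication: send each work unit to several
# volunteers and accept the answer only when a quorum of replies agree.
from collections import Counter
import random

REPLICATION = 3     # how many volunteers each work unit is sent to
QUORUM = 2          # how many matching results are needed to accept an answer

def simulate_volunteer(unit_id):
    """Pretend volunteer: usually returns the right answer, occasionally garbage."""
    correct = unit_id * 42
    return correct if random.random() > 0.1 else random.randint(0, 10_000)

def validate(unit_id):
    replies = [simulate_volunteer(unit_id) for _ in range(REPLICATION)]
    value, votes = Counter(replies).most_common(1)[0]
    return value if votes >= QUORUM else None   # None means reissue the work unit

for unit_id in range(5):
    outcome = validate(unit_id)
    status = "accepted" if outcome is not None else "needs re-run"
    print(f"work unit {unit_id}: {status} ({outcome})")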
Applications include research into disease, medicines, climate change, astronomy and particle physics. GridRepublic and the World Community Grid allow users to select the projects they wish to contribute to, while Intel is promoting its volunteer projects through Facebook. Many projects, such as the protein folding simulation Folding@home, now support processing using games consoles and the parallel instruction pipelines found on graphics processors. (See 'GPU computing' in TechNews 09/08.)

Research networks

Collaborative networks of academic researchers can assume that the infrastructure is trusted, diminishing the problems faced by public volunteer projects. However, the actual tasks are often far more complex, involving very large data sets and a much greater range of hardware, from desktop PCs through to supercomputers.
The Large Hadron Collider (LHC) will be reliant on massive grid computing capabilities to process the data that it is expected to produce. The WLCG has 11 Tier 1 and 140 Tier 2 data centres that will distribute the 15 million gigabytes (15 petabytes) of data created each year. The primary fibre optic network links will run at 10Gbps, allowing data transfers of several gigabytes per second through clustered channels.
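As a back-of-the-envelope check of those figures, spreading 15 petabytes evenly across a year works out at roughly half a gigabyte per second on average, while a single 10Gbps link carries about 1.25 gigabytes per second, so a few clustered links comfortably cover the average rate (peak rates are another matter). The small calculation below simply restates that arithmetic:

# Rough sanity check of the WLCG figures quoted above (all values approximate).
SECONDS_PER_YEAR = 365 * 24 * 3600

annual_data_gb = 15_000_000                    # 15 million gigabytes, as stated
average_rate = annual_data_gb / SECONDS_PER_YEAR
print(f"Average outflow: ~{average_rate:.2f} GB/s")     # roughly 0.5 GB/s

link_rate = 10 / 8                             # one 10 gigabit/s link, in gigabytes/s
print(f"One 10Gbps link:  {link_rate:.2f} GB/s")        # 1.25 GB/s
# Several such links clustered together cover the average comfortably, which is
# why transfers of "several gigabytes per second" are quoted.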
Computing facilities at this scale represent a considerable investment, so the prioritisation, scheduling and job control are critical to effective use. A number of projects and protocols (in the widest sense) are focussed on this issue. For example:
  • GridFTP is an extension to the standard internet file transfer protocol (FTP), allowing much larger blocks of data to be simultaneously and securely transmitted across multiple channels, as well as providing the facility to download just part of a single, extremely large file (a rough analogy using HTTP range requests is sketched after this list).
  • GridCOMP and Phosphorus are assembling frameworks and services to facilitate higher level project management.
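GridFTP itself requires a full grid security and client stack, but the two ideas highlighted above, striping a transfer across parallel channels and fetching only part of a single very large file, can be approximated with plain HTTP range requests. The sketch below is an analogy rather than GridFTP code, and the URL and chunk size are placeholders.

# Analogy only: partial and parallel ("striped") download using HTTP ranges,
# illustrating the ideas GridFTP adds to FTP. Not a GridFTP client.
import concurrent.futures
import urllib.request

URL = "https://example.org/very-large-dataset.bin"   # placeholder data file
CHUNK = 8 * 1024 * 1024                               # 8 MB per channel

def fetch_range(start, end):
    """Download just bytes [start, end] of the remote file."""
    req = urllib.request.Request(URL, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return start, resp.read()

def striped_download(total_bytes, channels=4):
    """Pull the first total_bytes of the file over several parallel channels."""
    ranges = [(i, min(i + CHUNK, total_bytes) - 1) for i in range(0, total_bytes, CHUNK)]
    pieces = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=channels) as pool:
        for start, data in pool.map(lambda r: fetch_range(*r), ranges):
            pieces[start] = data
    return b"".join(pieces[s] for s in sorted(pieces))   # reassemble in order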

Commercial opportunities

Large companies, including those involved in pharmaceuticals, aerospace simulations, data mining for market research and prospecting, also have immense processing requirements, but the data they handle can be commercially sensitive. SLAs must be legally watertight, covering issues like security, intellectual property and data protection (especially where personal information is held).
The function of GridEcon is to create an online auctioning, scheduling and management system for computational capacity. The EU's SIMDAT project had a wider remit, investigating all elements of grid computing, from protocol standardisation through to systems that let companies readily create virtual organisations, through which they can establish, administer and securely wind down the distributed processing and storage capacity needed for a given project.

The grid and the cloud

Many applications already run in the 'cloud', leasing facilities such as Amazon's Web Services (AWS) or Microsoft's soon to be launched Windows Azure Platform. Although these may use a distributed computing model to provide the services, they have a single point of accountability through the provider's SLA. The grid computing applications outlined in this article are far more complex, but they can provide computing power for 'free', or at a substantially reduced price, for academic researchers, while ensuring near full utilisation of expensive computing resources. This grid remains more informal in structure, collaborative in development and altruistic in nature, although it is becoming more formalised as the environment matures and the scale of individual projects increases, especially as commercial entities begin to adopt these approaches.
Educational establishments could consider donating spare computing cycles to advance areas of research considered to be for the good of humanity, although they need to factor in the management overheads that deployment is likely to incur and consider whether it will add significantly to energy consumption. Middleware, such as BOINC, could be deployed across a large institution to manage in-house processing tasks, or capacity could be leased from one of the cloud providers. However, access to massively scalable grid computing resources is likely to remain the province of research organisations based in higher education and industry.

AMD Systems - AM2 to AM3 upgrade overview - part 13

Now let’s see the results obtained in the 3D benchmarks, more specifically 3DMark Vantage:

Phenom II X4 940 (3GHz)
- CPU Score - Extreme - 5968
- CPU Score - High - 35915
- 3D Mark Score - Perf - 9237

Phenom II X2 550 (3.1GHz)
- CPU Score - Extreme - 5956
- CPU Score - High - 31740
- 3D Mark Score - Perf - 9256

Athlon 64 X2 5000+ (2.6GHz)
- CPU Score - Extreme - 5880
- CPU Score - High - 27006
- 3D Mark Score - Perf - 8859
The performance increase in 3D gaming scenarios varies considerably from title to title: World in Conflict, for example, seems to scale perfectly with increases in clock frequency and the number of cores, while Crysis doesn’t seem to know what to do with the two extra cores of the Phenom II X4 940. The latter seems to be an exception, however, as most games appear to be optimized for multi-core processors.

Intel CEO: PC sales could rise in 2009

(AP) -- The worldwide personal-computer market is pulling out of its slump quickly and could defy predictions by growing this year, Intel Corp. CEO Paul Otellini said Tuesday.

Otellini's comments at a conference Tuesday were more bullish than many analysts have been. Market research firms IDC and Gartner have both predicted a year-over-year decline in PC shipments in 2009, which would be the first such drop since 2001.
The market has been dragged down by a clampdown in corporate spending on new PCs, and some computer companies are already looking to next year for a rebound. Sales of cheap little "netbook" computers, used primarily for surfing the Internet, have been a bright spot, but those machines ring up low profits for PC and chip makers. Intel is the world's top maker of microprocessors, the "brains" of PCs.
Otellini said the rebound is being fueled by the fact computers are "indispensable, something that people need in their daily lives."
"I think that the market is poised for a resurgence," he said. He said he expects PC sales to be "flat to slightly up" this year from last.
Researchers at Gartner Inc. predict a 2 percent decline in PC shipments for 2009, though that's better than a few months ago, when the group was forecasting a drop of 6 percent.
"Things are looking much better in the second half," Gartner research director George Shiffler said Tuesday.
But Shiffler wasn't quite willing to go as far as Otellini did. He expects second-half shipments to be flat from last year, not strong enough to push the entire year into positive territory.
"It wouldn't surprise me if we did see positive growth, but that's not our call at the moment," Shiffler said. "I think that's more of a best-case scenario."
Intel shares rose 8 cents to $19.62 in afternoon trading. Shares of rival Advanced Micro Devices Inc. gained 30 cents, or 5.2 percent, to $6.11.
Among PC makers, Dell Inc. fell 28 cents, or 1.8 percent, to $15.73, while Hewlett-Packard Co. gained 68 cents, or 1.5 percent, to $47.03.
Since Otellini proclaimed in April that PC sales had "bottomed out" after a miserable holiday season, he has been more aggressive in his forecasts than even Intel's biggest customers. That has raised questions about how much of the recovery in Intel's sales has been caused by computer makers restocking depleted chip supplies and how much has come from end users buying more machines.
Still, Otellini's remarks Tuesday help explain Intel's decision last month to raise its third-quarter revenue guidance to $8.8 billion to $9.2 billion. The previous range was $8.1 billion to $8.9 billion.
Otellini also used his presentation at the Intel Developers' Forum to show off chips built on so-called 22-nanometer technology, which refers to the ever-shrinking size of circuitry on the most advanced chips. Those chips are still being developed in Intel's factories and won't go into production until 2011.
Each chip on the silicon "wafer" Otellini showed off has 2.9 billion transistors. Intel's first chips in the 1970s had just a few thousand transistors.