Building an architecture for innovation

Architectural techniques for improving search engine rankings

Devin Bost

March 8, 2014

One of the great challenges for website developers is providing more content in less time. As languages, frameworks, and platforms continually evolve and become more advanced, it is easy for a developer to feel somewhat lost among the myriad options available. Questions a developer might face include, “How should I spend my time? What should I be learning now? And how will I know if I’m moving in the right direction?” Faced with so many technological challenges, it can seem overwhelming to sacrifice the time required to add content to the website. When I say “content,” I am referring to articles, FAQs, and other information that will attract search engines and visitors to the website. Time focused on adding content may take away from time that could have been spent improving the infrastructure. Even after massive time spent writing articles, it is sometimes humbling to see your website only barely begin to obtain modest positions in various searches across the internet. Then, when the web developer realizes that they need to spend time on infrastructural changes, such as preventing spam from filling up their contact forms, it can be very disappointing to watch search engine rankings rapidly drop. So, you may ask, “What techniques are available to obtain sustainable growth in the number of visitors who find my website?” To this question, there is one ultimate answer: architecture.

I will address this topic by starting with a parable. Consider two architects, not software architects, but the kind who construct buildings. The first architect rushes through the design, obtains capital through loans, and quickly leverages available resources to start constructing the beams and walls of the building. The second architect spends much more time in the early design stages. The second architect ensures that every pipe, every room, every door, and every last detail is considered. The second architect performs a thorough evaluation of the climate, the soil, and the risks of major disaster. The second architect doesn’t begin building until absolutely certain that the building has a firm foundation, a foundation that will not fall. Now let’s jump ahead in time. The first architect has constructed nearly half of the building, but he now realizes that there is a problem with the layout of some of the plumbing. To continue with construction, the top half of the building will need to be redesigned. The money required for these design changes will be massive, and without starting over, certain structural consequences of the redesign will leave the building vulnerable to natural disaster. This building has been built on a sandy foundation. The second architect, however, was much more careful. Every decision was made with the utmost analysis and planning. As a consequence, the foundation was constructed with the future in mind, and the construction proceeded with much greater organization and cost savings. Once the construction gained momentum, milestones began occurring ahead of schedule. The second architect’s building was built on a sure foundation.

Software architecture has many similarities to the architecture of commercial buildings. With a good software framework, the developer can accomplish much. I have seen many developers write web applications in PHP only to realize later how complicated it would be to build software applications that interoperate seamlessly with their website code. Nonetheless, the key to obtaining sustainable rankings in major search engines lies not in the content you deliver, but in your ability to let your users create content. Think about some of the largest and most successful websites on the internet. How many of them became successful without providing an interactive service? The best software architecture is designed with usability in mind. Is your website simply acting like a billboard? Or does your website provide a service that your users need? You must consider what your users’ motivations will be. Many websites want to tell their users what to do and what to believe. Far fewer are the websites that listen to their users and act upon what their users say. Take some time and think about the ten most successful websites or web technologies you have ever seen. How much of their content is generated by their users?

This architecture is the key to obtaining sustainable search engine rankings. Without good architecture, your site is built upon a sandy foundation.

 


Creating an effective AdWords advertisement

Reaching physicians with complicated patients

Reaching physicians is somewhat of an art. Physicians tend to be extremely busy, and they don’t want to waste what little time they have listening to a company try to sell them something they suspect won’t help them. Physicians want to improve patient care, but they don’t want complex software or a significant learning curve. To reach physicians, we must use fewer words, and we must understand what is important to them. This rule holds for any form of marketing, but it is especially true with doctors. Here’s what’s important to physicians:

  1. Improving patient outcomes and saving lives;
  2. Saving time;
  3. Preventing mistakes.

The background of specialty medicine

It is not uncommon, particularly within specialty medicine, for a provider to have a patient on a complex multi-drug regimen that may be completely unrelated to the specialist’s area of expertise. Unfortunately, it is quite common for drugs with very different domains of pharmacotherapeutic effect to interact in unexpected ways. If a doctor knew in advance that the drug they intend to prescribe would, for example, cause a potentially lethal change in the level of an enzyme acted on by an unrelated drug the patient was already taking, then the physician would certainly choose an analogue of the drug, choose an alternative treatment option, or do something other than expose the patient to the risk of the dangerous combined effect.

The vision of drug interactions

A drug interaction tool would provide exactly this service. It would allow physicians to restructure a patient’s drug regimen to ensure that the desired therapeutic outcomes are achieved without serious short- or long-term adverse consequences. Such a tool would also save the physician considerable time that would otherwise be spent digging through the literature and hoping to find the desired information. It would save patients considerable money and time, prevent unneeded doctor visits, labs, and tests, and improve their overall satisfaction with their medical care. Although physicians may like to think that they don’t make mistakes, we all know that we are imperfect; a drug interaction tool would therefore help doctors detect errors and prevent mistakes that could harm patients and lead to costly litigation.
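
At its core, such a tool reduces to checking every pair of drugs in a regimen against a knowledge base of known interactions. Here is a minimal sketch of that idea in Python; the drug names and the single interaction entry are hypothetical placeholders, not clinical data, and a real tool would query a curated pharmacology database:

```python
from itertools import combinations

# Hypothetical placeholder data, NOT clinical information. A real tool
# would query a curated, regularly updated pharmacology database.
KNOWN_INTERACTIONS = {
    frozenset({"drug_a", "drug_b"}):
        "drug_a inhibits the enzyme that clears drug_b (risk of toxic accumulation)",
}

def check_regimen(drugs):
    """Return a warning for every known interacting pair in the regimen."""
    return [
        KNOWN_INTERACTIONS[frozenset(pair)]
        for pair in combinations(drugs, 2)
        if frozenset(pair) in KNOWN_INTERACTIONS
    ]

# Flags the one known pair even though drug_c is unrelated to the others.
print(check_regimen(["drug_a", "drug_b", "drug_c"]))
```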

Creating the ad – an invitation without false claims

Unfortunately, in this digital age, multitudes of technologies available to the provider falsely claim to save physicians’ time, reduce medical errors and costs, and improve the quality of medical care. Very few of these technologies can justify their claims, and even fewer can prove the legitimacy of their justifications. Like most of us, physicians want increased intelligence. They want answers to questions they haven’t asked yet, but they don’t want those answers until they are ready for them. They want a tool that improves the accuracy of their clinical decision making, but they don’t want to sacrifice countless hours only to determine whether they are wasting their time. By developing a service that is free to try and has a very simple interface, we ensure physicians won’t need to waste their time. They can try the tool; if it helps them, they will be happy, and if it doesn’t, that’s fine: they will have wasted very little of their time and none of their money. We try to convey our message in advertisements like this one:

Druginteractionschecker.com

Predict metabolites’ effects.

Improve patient outcomes.

This ad is a simple invitation to come and see. When the free app is ready, we will adjust the ad accordingly. The ad reflects what the service offers: predictive intelligence that improves patient outcomes. In general, from a marketing perspective, reaching our target audience requires using the language our customers are most likely to use, and that is precisely what we have done.

 

Search Engine Optimization techniques

Devin Bost

February 23, 2014

One may ask, “How do we measure the results of our search engine marketing and optimization?” Measurement requires Google Analytics, and in this article we will assume that Google Analytics has already been configured. Based on the data Google Analytics provides, the process for improving site metrics is as follows:

  1. First, we set up filters. We use filters to isolate traffic in the following areas (in order of importance, from highest to lowest):
    1. Organic search results;
    2. Paid search results (this applies only when using pay-per-click advertising with Google AdWords);
    3. Unique new visitors from non-search engine sources.
  2. Second, we set our metrics (external variables). We should set up different metrics for each advertising campaign, and different metrics for organic search traffic. There are several benchmarks (variables) I like to collect data for:
    1. Number of unique visitors, or unique visitor count;
    2. Unique page view count;
    3. Hit count, partitioned by landing page URL (filtered to display only pages generating one or more unique visits);
    4. Hit count, partitioned first by keyword phrase (the search term used to land on a page), then by landing page URL (the URL the search brought them to);
    5. Relative position of ranked pages on Google, weighted according to their position (with an exponential decay model I developed; an illustrative sketch follows this list);
    6. Return visit count, partitioned by IP address;
    7. Bounce rates:
      1. Partitioned by keyword phrase, then landing page URL, then by number of internal links (aka layer count) clicked on;
      2. Partitioned by landing page URL, then keyword phrase;
    8. Visitor count, partitioned by backlink URL. These are visitors who landed on our site by following a link from someone else’s website; according to Brin and Page (1998), backlinks have been important since the creation of Google’s search algorithm.
  3. Third, we set our internal variables. These are what we generate internally. This technique becomes invaluable once our external variables begin exhibiting acceleration; then we can use mathematical techniques to gain insight into how our changes to page content (internal variables) affect our external variables. It is very important to track changes to site content: it becomes very hard to assess rankings when it is unclear which version of a particular page was responsible for obtaining a top ranking. For this reason, revision control must be maintained across the site. HTML tags must be analyzed and tracked. Here are descriptions of how these are used:
    1. Title tag: It defines the page title and communicates to the search engines what the page is about. The target keyword must be included in this tag. Because the title is displayed to Google search users, it is important that we apply some practical psychology here (a tag-checking sketch follows this list);
    2. Description meta tag: It provides a summary of the web page and its contents. In most cases, this description also appears in Google search results, just below the title; the target keyword must be included in this tag;
    3. URLs: An optimized URL is self-explanatory and self-documenting; the target keyword must appear in the URL as well;
    4. Heading tags: These tags are used to emphasize important text and to establish a hierarchical structure of keywords on the web page. Heading tags also inform the search engines how the page content is organized. The <h1> tag defines the most important keywords and <h6> the least important;
    5. Keyword placement: This data will become more relevant when we start clustering keywords for strategic optimization on keyword stems. Several techniques may be used on this data, depending on how we implement clustering; later on, we can apply neural networks and natural language processing. Language processing is much easier when content is stored in a database that offers out-of-the-box text processing features;
    6. Content keyword density: According to Parikh and Deshmukh (2013), search algorithms place great emphasis on keyword density, so it is important that targeted keywords have greater density in the relevant content (see the density sketch following this list);
    7. Use of robots.txt: The robots.txt file tells search engines which pages or directories may be crawled. Configuring this file correctly helps ensure that all optimized pages get indexed (a sketch for verifying crawlability follows this list);
    8. Images: Use the image alt attribute to provide an accurate description of the image; the target keyword should be used in the description, if possible. The alt attribute of the <img> tag specifies the alternate text describing what the image contains in case the image doesn’t load. It is also used by screen readers for people with disabilities;
    9. Use of the “rel=nofollow” attribute: In an HTML anchor tag <a>, the rel attribute defines the relationship between the current page and the page being linked to. The nofollow value signals web spiders not to follow the link; in other words, it tells Google that your site is not passing its reputation to the linked page or pages;
    10. Sitemaps: Keeping the sitemap updated is key to good site rankings, because search engines depend upon sitemaps to learn which web pages the site currently contains (a generation sketch follows this list);
    11. Time interval: This is the frequency at which we take measurements. Monthly is fine initially; once we have enough data to observe our rates of change, we can move to a weekly interval.
  4. Fourth, we will track internal links and external links once we have traffic that doesn’t bounce. We will discuss this more later. External linking is considered off-site SEO. Important factors, although rather difficult to track, are:
    1. Keyword in the backlink: Google’s ranking algorithm places high value on the text that appears within the link. The text within the link gets associated with the page and describes the page it links to. For this reason, it’s important to have the target keyword within the text of the backlink;
    2. Gradual link-building: It’s important to build backlinks gradually. The link-building process should be natural and steady, which is why SEO takes a lot of work and patience to implement. Furthermore, it is an intentional part of Google’s strategy that reputation building not happen overnight. In fact, if a site were to acquire dozens or hundreds of backlinks overnight, Google would almost certainly treat this as a red flag (spam) and would most likely penalize the site. But if the site content is compelling, people can find it through search (or through other means) and link to it. When that occurs, the site owner has no control over the number of backlinks the site generates, and Google can tell that these backlinks weren’t manufactured.
    3. Writing articles to establish domain authority: Writing articles and getting them published on other reputable sites is a strategy that can help your site get backlinks. Getting an article published on trusted sites such as About.com, Wikipedia.org, or NewYorkTimes.com, with a backlink in return, will help increase your website’s reputation and achieve higher rankings;
    4. Personal networking to establish a reputation: It is recommended that we make efforts to reach out to those in the site’s community, particularly sites “that cover topic areas similar to [ours]. Opening up communication with these sites is usually beneficial” (www.google.com/webmasters/docs/search-engine-optimization-starter-guide.pdf). Contacting sites related to what your site is about is a great way to network, promote, and increase your site’s exposure;
    5. Finding your website’s natural affinity group: Find websites that are related to or cover similar topics as yours for potential networking opportunities. Backlinks from off-topic sites do not count as much as links from sites whose content is related to yours.
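
For item 2.5 above, the exact decay model I use is my own; as a rough illustration of the general shape, the Python sketch below weights a ranking at position p by e^(-λ(p-1)), so a first-place ranking counts fully and deeper rankings contribute exponentially less. The decay rate of 0.3 is an arbitrary placeholder, not the value from my model:

```python
import math

def weighted_rank_score(positions, decay_rate=0.3):
    """Sum of exponentially decayed weights across a page's tracked rankings.

    A position-1 ranking contributes exp(0) = 1.0; each step down the
    results page shrinks the contribution by a factor of exp(-decay_rate).
    """
    return sum(math.exp(-decay_rate * (position - 1)) for position in positions)

# A page ranking 1st, 4th, and 12th on three tracked queries scores ~1.44,
# dominated almost entirely by the first-place result.
print(weighted_rank_score([1, 4, 12]))
```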
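
For the title and description tags in items 3.1 and 3.2, a simple script can flag pages whose tags are missing the target keyword or are long enough to risk truncation in search results. The 60- and 155-character limits below are common rules of thumb, not official Google numbers:

```python
def check_tags(title, description, keyword):
    """Flag basic problems with a page's <title> and description meta tag."""
    problems = []
    if keyword.lower() not in title.lower():
        problems.append("target keyword missing from <title>")
    if keyword.lower() not in description.lower():
        problems.append("target keyword missing from description meta tag")
    if len(title) > 60:  # rule of thumb: longer titles may be cut off
        problems.append("title risks truncation in search results")
    if len(description) > 155:  # rule of thumb for description snippets
        problems.append("description risks truncation in search results")
    return problems

print(check_tags(
    "Drug Interactions Checker - Predict Metabolite Effects",
    "Check a patient's regimen for dangerous combined effects.",
    "drug interactions",
))  # -> ['target keyword missing from description meta tag']
```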
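
Item 3.6’s keyword density is straightforward to compute as the share of a page’s words that match the target keyword. This is a crude word-level approximation; real search algorithms also weight placement and markup:

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that exactly match `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

sample = "Drug interactions matter. A drug interaction checker flags risky drug pairs."
print(f"{keyword_density(sample, 'drug'):.1%}")  # 3 of 11 words -> 27.3%
```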
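
For item 3.7, the Python standard library’s urllib.robotparser can verify that an optimized page is actually crawlable under the current robots.txt. The domain below is the one from the ad above; the article path is a hypothetical example:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://druginteractionschecker.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# True if Googlebot is allowed to crawl this (hypothetical) landing page.
print(rp.can_fetch("Googlebot", "https://druginteractionschecker.com/articles/"))
```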
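
Finally, for item 3.10, keeping the sitemap current is easy to automate. This sketch writes a minimal sitemap with the standard xml.etree module; the page URLs and dates are placeholders to regenerate whenever site content changes:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages, path="sitemap.xml"):
    """Write a minimal XML sitemap listing each page URL and its last-modified date."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder pages; rerun whenever content is added or revised.
build_sitemap([
    ("https://druginteractionschecker.com/", "2014-03-08"),
    ("https://druginteractionschecker.com/articles/", "2014-02-23"),
])
```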

References

Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine. Computer networks and ISDN systems, 30(1), 107-117.

Parikh, A., & Deshmukh, S. (2013, November). Search engine optimization. International Journal of Engineering Research and Technology, 2(11), 3146–3153.