This article discusses what kinds of content are most beneficial to website designers and their websites, and why. I agree with the author’s point that it is essential to invest in quality content. The volume of websites grows by the hundreds and thousands daily, so it’s not enough for your website to just stand out. Sure, whatever made your website stand out drove the traffic there, but what keeps visitors there, or gets them to return? It’s the valuable content on your page that they’re looking for and actually want to read. Anyone can create a flashy website, but within a few clicks a visitor can tell whether it’s junk or not.
To briefly summarize, the author recommends investing in content such as infographics, authoritative blog posts, standout opinion pieces, how-to content, original research pieces, trending content, and videos. I found most of these interesting, though not surprising. I wasn’t surprised because all semester our professor (for my Web Design course) has had us writing blog posts and creating infographics. I knew there was clearly a reason for doing them; otherwise there wouldn’t be such an emphasis on them. My blogs are lacking, and truthfully I do them to meet the course requirement, but I can see how they could be beneficial and useful to a website. Crafting a quality blog, however, takes a lot of time and mindfulness that right now I just don’t have. The infographics are very interesting and very challenging: they have to get a message or story across with virtually no words. I made one but still could not make it near wordless (though there aren’t too many words). Finally, I leave you with the reasons the author gives for investing in quality content: high return on investment (ROI), supporting other strategies, diversifying your online presence, helping you compete (against other websites with similar interests), and being inexpensive (especially important).
Earlier this semester, SEO specialist Becky Livingston recommended in class that, since we’re all working on websites for our finals, we should make them secure if we want them to be successful.
Since the birth of the web, most pages have been served over HTTP (Hypertext Transfer Protocol, the set of rules for how browsers and servers communicate and transport data). Soon these pre-existing web pages, as well as the brand-new pages that will emerge, will have to be HTTPS (HTTP Secure). The primary reason for making web pages HTTPS is its authentication of the visited website and its protection of the privacy and integrity of the exchanged data. HTTPS authenticates the website and the associated web server with which one is communicating, which protects against man-in-the-middle attacks. Additionally, it provides bidirectional encryption of communications between a client and server, which protects against eavesdropping and against tampering with or forging the contents of the communication.
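As a rough illustration of what browsers are starting to do, here is a minimal Python sketch (the function name and the exact “not secure” wording are my own, not Chrome’s actual implementation) that flags any URL not served over HTTPS:

```python
from urllib.parse import urlparse

def label_urls(urls):
    # Loosely mimic Chrome's policy of flagging plain-HTTP pages:
    # anything whose scheme is not "https" gets a "not secure" label.
    labels = {}
    for url in urls:
        scheme = urlparse(url).scheme.lower()
        labels[url] = "secure" if scheme == "https" else "not secure"
    return labels

print(label_urls(["https://www.pace.edu", "http://example.com"]))
```

A real browser, of course, also verifies the site’s certificate; this sketch only looks at the URL scheme.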
The article I found on Search Engine Journal said that within nine months (a short period in internet time) the percentage of HTTPS results on Google’s first page went from 30% to 70% (the remaining 30% is split between advertising links and a few sites that are still plain HTTP). There are still some major pages that haven’t made the switch, but pretty soon Google is going to be labeling HTTP sites “non-secure.” If an e-commerce site, or any site that requires personal or credit card information, is labeled non-secure, it’s basically a death sentence for that site and company. So the change doesn’t need to be immediate; Google is giving sites six months to become secure, but after that six-month period they will be labeled insecure. According to another link, it’s not known what Firefox or other browsers will do on the HTTP/HTTPS question, but Chrome for sure needs the Secure. Additionally, Google warned that Chrome’s Incognito browsing mode will flag every site that doesn’t switch to HTTPS.
Link to original article: https://www.searchenginejournal.com/half-googles-first-page-results-https-according-moz/195409/
Link to the additional info: http://www.zdnet.com/article/google-tightens-noose-on-http-chrome-to-stick-not-secure-on-pages-with-search-fields/
The latest article I found, “How Updating Old Posts Can Increase SEO Traffic,” is perfect for linking something I mentioned in my last blog with what I’m about to write in this one. In my last blog post I talked about social media optimization. No, this post isn’t quite about that. What I mean is that I mentioned one of my peers at Pace giving advice about creating content on social media, specifically her third tip: it’s okay to re-purpose and recycle older material.
This article on Search Engine Journal says it’s okay, and actually encouraged, to update old posts. It lists several steps, starting with identifying which posts are even worth updating (those that bring organic traffic on their own). Then it discusses what to flesh out and what to remove (old anchors or links that may be outdated or dead). Following these first two important steps are additional suggestions from author Tom Demers. His recommendations look at what posts from several years ago might be missing in today’s website world, especially in an environment that is constantly changing and developing 24/7.
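The “what to remove” step, pruning dead links, can even be partly automated. Here is a minimal Python sketch (the function names are mine, and a real audit tool would do much more) that pulls the anchor links out of a post’s HTML so they can be checked:

```python
from html.parser import HTMLParser
import urllib.request
import urllib.error

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags in an old post's HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def find_dead_links(html, timeout=5):
    # Issue a HEAD request to each absolute link and report the ones
    # that error out; these are candidates for removal or replacement.
    dead = []
    for url in extract_links(html):
        if not url.startswith(("http://", "https://")):
            continue  # skip relative and mailto: links in this sketch
        request = urllib.request.Request(url, method="HEAD")
        try:
            urllib.request.urlopen(request, timeout=timeout)
        except (urllib.error.URLError, OSError):
            dead.append(url)
    return dead

print(extract_links('<p>See <a href="http://old.example.com/post">my old post</a>.</p>'))
```

Note that `urlopen` raises an error for 404s and other failing statuses, so broken links land in the `dead` list without any extra status checking.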
I was interested in this post because I have some old blog posts of my own from two or three years ago on various blog platforms (WordPress and Blogger), which I’m sure are no longer relevant, or whose referenced links might be inactive. If they were to come up in someone else’s search results and were inaccurate or had bad links, the user would be frustrated with the site, leave angrily, and my blog would find its way to the very back of the results pile (which can be pages upon pages). After reading through some of this, I think I may revise some older stuff I’ve posted (in what little free time I have right now).
If you’re interested in figuring out what steps to take to update your former sites, or whether your site needs updating at all, check out the link: https://www.searchenginejournal.com/increase-seo-traffic-updating-thickening-old-blog-posts/191891/?ver=191891X2
This article is about 12 different free social media tools. Most of the time Search Engine Journal offers tips about optimizing websites or web content for search, so I figured tips for more mobile, constantly changing content might be interesting as well.
Most of us in this class typically find our “interests for the day” from what’s trending on our social media feeds, because that’s where people our age and younger constantly go for information, validation, and so on. It’s one thing to scroll through and read the content on social media, but it’s a whole different thing to be the content creator, and beyond that, to do it in a way that the information is catchy enough that others want to share it, thus going viral.
Only one person I’ve ever met (here at Pace University) has created viral content or reached a following in the tens of thousands. I’ve heard her give three main pointers for creating viral content and being impactful on social media: (1) Know your audience, since most of the time they’re the initial ones who will share your post and assist its movement; (2) Figure out what’s trending right now, one easy way for your hashtags or posts to get noticed faster; (3) Don’t be afraid to recirculate old material or content, because sometimes you’re just ahead of the curve, and something you wrote months ago is more relevant today than it was then.
This Search Engine Journal article gives its tips by way of social media management tools that are free to all, and who doesn’t love FREE? The 12 tools mentioned are: Mention, IceRocket, Addictomatic, SumAll, IFTTT, Google Analytics, Facebook Insights, TweetDeck, Rapportive, Swayy, Qzzr, and Easel.ly. I wouldn’t dare sit here and repeat in detail what all 12 do, because they’re all different from one another. Like the profiles of the people they assist, these tools do a variety of things, from retrieving your data from any (and every) social network you use, to connecting you with people of similar demographics through a completely different application or website (e.g., information can be pulled for LinkedIn purposes but from Gmail/Google circles). Which tool or tools to use depends on the creator’s intentions for their content.
Check the link to find out more information and get linked to the social media management tools: https://www.searchenginejournal.com/12-free-social-media-tools/116841/
Just a few days ago I came across an article on Search Engine Journal that warned of six on-page SEO sins that can get a web designer into major trouble. In previous classes we’ve discussed how to optimize our sites in the best and most ethical ways possible.
I’m new to the web design business; by “business” I mean I’m on website trial one, due to a class assignment. Regardless, I’m trying to work through it as best I know how, just like my fellow classmates. I figured this might be a good one to share with them: we’re human and we make mistakes, but it’s better to avoid the mistakes the first time around.
The six on-page SEO tactics to avoid are: 1. Keyword Stuffing, 2. Spammy Footer Links and Tabs, 3. Cloaking, 4. Internal Linking with Keyword-Rich Anchor Text, 5. Dedicated Pages for Every Keyword Variant, 6. Content Swapping.
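The first sin, keyword stuffing, is the easiest to spot-check yourself. Here is a minimal Python sketch (the function and any threshold you’d pick are my own illustration; Google’s actual signals are far more sophisticated) that measures how much of a page’s copy one keyword takes up:

```python
import re

def keyword_density(text, keyword):
    # Fraction of the words in the copy that are the given keyword.
    # Pages where a single phrase dominates the text read as "stuffed"
    # to both human visitors and search engines.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

copy = "Buy shoes online. Shoes shoes shoes! Best shoes for shoes lovers."
print(round(keyword_density(copy, "shoes"), 2))  # more than half the copy is one word
```

The fix the article points toward is simply writing for readers first; a density anywhere near the example above would be a red flag.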
Again, being new to the web design realm, I’m not familiar with all of these tactics, but the article does a great job of explaining what each one is, how it’s usually done or caught, and a solution for avoiding the problem. So it’s definitely worth a read, or at least a skim, for anyone looking to manage their SEO tactics. Follow the link
One thing I found interesting is that this article can be linked back to one I mentioned earlier, in “Great Google, now this?” I was looking to see which SEO tactic stood out to me the most, and I landed on number 6, content swapping. While I find creating a website hard enough, and I’d prefer the pages of my website actually be about what they say they’re about, others apparently change their content after Google indexes their site. Google is ultimately a machine trying to keep up with billions of users’ demands; it is not perfect, and while it is continuously refreshed, it can take a while before it realizes that a site it initially indexed has changed.
This is where that previous article comes into play. That article had an opt-out solution for the snippet spotlight, but there isn’t yet a way to opt out of getting faulty search results caused by content swapping. The web designer does receive a punishment from Google once the issue is found, but if they’re that into changing up content, they’re probably onto another swap before Google realizes the last one is gone. Again, as I stated in my previous blog, it’s just something we have to be cautious of while using search engines for results.
According to the article “Don’t read everything you see on Google,” Google users should be cautious of some of the result snippets that tend to be spotlighted at the top of the search results page in a special box. Danny Sullivan, editor of the blog Search Engine Land, found that the results spotlighted by the search engine are not all accurate. The article gives several examples to support this claim against one of the largest search engines of the internet age. From what I’ve thin-sliced from the article, there are two answers to why this hasn’t changed yet. Answer 1: Issues like this would put Google under scrutiny and undercut it in the search engine game. Answer 2: There’s an “opt-out” feature for turning off this snippet spotlight.
What does that mean? To me, an avid Google searcher, I wonder how many times I’ve looked something up quickly and not taken the time to see whether the answer was accurate. Am I right? When we use a search engine to look something up, we’d like the answer quickly and correctly. Don’t get me wrong, I’ve begun checking some results, but only for the little “Ad” label, not doing a complete background check on the answer.
This doesn’t surprise me, though. Most search engines pull from the internet or large databases. Because we expect answers at top speed, it wouldn’t be possible for search engines to spit back results that fast without some kind of mathematics or algorithm behind the screen. An algorithm doesn’t make the results 100% accurate; like most statistical methods, it carries some degree of error or variance (or at least that’s what I recall from general stats). What did surprise me is that Google is aware its search engine does this snippet spotlight, yet its answer to the problem is for the user to “opt out” rather than changing its algorithms. While I can appreciate that a lot more goes into search engines than typing something in and hitting search, unless a user reads this article or is tech savvy, they wouldn’t know there is an opt-out option. I’ve been using Google for years, and I only just found out how to opt out of the snippet spotlight, from research of my own.
All in all, like most issues with the internet: exercise caution! If a result doesn’t sound right, chances are you’re smarter than its content, so look at another result to see if the answer is validated. If you’re interested in reading more about the snippet spotlight or how to “opt out,” follow the link: http://www.msn.com/en-us/news/technology/dont-believe-everything-you-search-on-google/ar-AAo0mfO?li=AA4Zoy&ocid=spartandhp
This week Google’s flagship app released new updates, specifically for iOS, designed to deliver accelerated mobile pages (AMP) to users during searches. Google marks AMP pages with the lightning bolts that have become familiar markers of AMP results, as well as visual, scrollable pictures (almost in the slideshow manner commonly seen on many websites). This update came in response to Apple News and other projects that are changing the way people use search engines and, in turn, the quality of the information returned. It’s supposed to be a very transparent and open move from Google. The framework is becoming more and more heavily cached as updates roll out (fixing some of the launch’s bugs), so that Google can display the result right there in the results list (more visually appealing than just a link) and remove the need for the user to retrieve the page from the original site. Its reception has been average. It provides fast, efficient readability and lets sites keep their incoming revenue through plugins like AdSense and other ad networks; however, it is supposedly an absolute pain to delete or clear search histories (I wonder what that person’s searching? ;] ).
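For the curious, a publisher advertises the AMP version of a page with a `<link rel="amphtml">` tag in the regular page’s head, which is how Google discovers it. Here is a rough Python sketch (the regex and the sample HTML are my own simplification; real crawlers use proper HTML parsers) that pulls that URL out:

```python
import re

def find_amp_url(page_html):
    # A page points search engines at its AMP version with
    # <link rel="amphtml" href="...">. This simplified regex assumes
    # rel appears before href, which is common but not required.
    match = re.search(
        r'<link[^>]*rel=["\']amphtml["\'][^>]*href=["\']([^"\']+)["\']',
        page_html,
        re.IGNORECASE,
    )
    return match.group(1) if match else None

html = '<head><link rel="amphtml" href="https://example.com/post/amp/"></head>'
print(find_amp_url(html))
```

If the tag is absent, the function returns `None`, which is also how a crawler would conclude a page has no AMP version.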
In my opinion, this could be a real game changer for websites and their developers. They now have to think about the content of their postings and their specific tags, because with a primarily read-only view it is less likely that users will be distracted by the images or the organization of the website. No longer is it all about looks; it’s about the information’s validity and accuracy, as it should be. It also says a lot about how we, the consumers, have such a high demand for valid, accurate, and speedy information gathering. If websites expect visitors to find them, they have to keep up to date with what’s being used and what’s accessible, in terms of search engines and best optimization practices.
To read this article for yourself, follow the link: SEJ’s Google AMP update article. For additional information about AMPs and this new update, use either of these links: Distl’s brief AMP breakdown or AMP.