
Wednesday, 22 October 2014

Windows 8 and Windows 8 RT - Issues and Workarounds to get Windows 8.1 behaviour

Getting around Window 8's issues

By Strictly-Software

I use Windows 8.1 (the proper version - thank god), but if you are stuck on Windows 8 or the Windows 8 RT version you should upgrade to Windows 8.1 ASAP. Windows 8 was designed so that Microsoft could use the same display format across phones, tablets, PCs and laptops.

The problem was that the techies didn't want useful features numptified and hidden away behind a design obviously meant for touch screens and not for coders.

So they introduced Windows 8 RT, a sort of halfway house between Windows 8 and Windows 8.1, as they realised they were going to have another Vista issue, where people hated the OS and many skipped over it or just stayed with the previous version, Windows XP (I know I did!).

So when they realised they had another Vista/XP issue on their hands they rolled out Windows 8 RT, which tried to get rid of the main tablet screen and gave you an option to get back to a normal desktop screen.

Windows 8.1 was a total reversal of Windows 8, with the desktop as the main screen and the horrible tile screen accessible by the home button.

As the article says:

If your PC is currently running Windows 8 or Windows RT, it's free to update to Windows 8.1 or Windows RT 8.1. And unlike previous updates to Windows, you'll get this update from the Windows Store.

by following this Windows Knowledge Base article: upgrade from Windows 8.0 or 8.1 RT to Windows 8.1.

Making some changes

The first thing I hated about the OS was that the mouse would change the zoom level of the screen without me asking it to. In Chrome and elsewhere I was constantly putting the zoom back to the normal 100% ratio by hand.

In fact the mouse was moving so fast I could hardly see it, so I needed to make some changes. The changes below are the ones I made.

Increasing the size and changing the colour of the mouse

I changed the colour of the mouse pointer using the "Pointers" tab. In the top selection, where you pick the group of pointers you want to use, I chose "Windows Black (extra large)", and in the bottom selection I chose "Normal Select".

     

Now I could see the cursor much more easily due to the colour change, as most programs have a white background, so a white cursor with a black border is not very helpful.

Changing the auto drag that the mouse does

One of the annoying things about Windows 8+ is the ClickLock feature of the mouse, which lets you highlight or drag things without holding the mouse button down.

You can turn this off in the "Buttons" > "Click Lock" panel by de-selecting the "Turn on Click Lock" feature.

This was one of the major pains when I first used Windows 8.1, especially when I was remote desktopped into my work PC, until I found a way to turn it off.


Changing the mouse controlling the size of the screen

This is one thing that really pissed me off. When I was surfing the net or just writing, a swipe of the mouse would change the whole screen zoom. In Chrome you can at least put the zoom back instantly with CTRL+0, and the pinch-zoom gesture itself can usually be disabled in the touchpad manufacturer's own settings panel in the Control Panel.

Other Mouse Changes

I also made the following changes. In the "Pointer Options" tab I changed the pointer speed to a slower setting.

With a bigger black pointer it helps a lot. I also did the following.


  • I chose to "Automatically move pointer to default selection in a dialogue box". No point hunting around the dialogue box is there!
  • I chose to "Show location of the pointer when the CTRL key is hit". A nice circle appears, helping you find the cursor.
  • I disabled the "Hide pointer when typing" option so that I can always see the cursor.

Other Windows 8.1 Programs

With all the numptification going on with the Windows tablet format, there are some actual applications that can help you without going to the command line.

Even though the start screen is pretty crap, you can either use the "Search" function to find an app or just go to the app page and start typing. The letters will appear in a search box in the top right of the page, e.g type "Paint" to get the Paint application.




The Windows App Store is one of the great features that Windows 8 introduced and there are lots of great programs you can download for free, including "Live TV" to get TV from Freeview channels as well as US and European TV shows.

With WiFi and network connectivity you want to check that your broadband speed is fast and everything is okay. One of these programs is "Network Speed Test" which you can get from the App Store.

This application tests your broadband or WiFi speed and tells you whether you have the capability to download high or low quality video, plus your internet status, network details and other ISP related information.

So now you know how to get Windows 8.1 features, change some annoying ones in the OS related to mouse usage and use the Windows App Store to download apps to test your WiFi for you.

You should now be on your way.

Tuesday, 21 February 2012

Debugging on the iPhone

Debugging Console on iPhone

I have an iPhone 3GS and one thing I have always found annoying is that when I am developing web pages for display on the iPhone you cannot debug them as easily as you can with a PC.

Yes you can get simulators and use user-agent switchers but they are not the same as debugging on the real device.

If I use an agent switcher and change my user-agent to GoogleBot or IE 6, that isn't going to properly simulate a web page that tests for agents and devices, or handle libraries that test for unique iPhone / iPad features available on the device and loaded in by specialist external libraries, such as the two finger scroll etc.

A proper coder will test for a BOT in ways that don't rely purely on the user-agent, including checking the known BOT IP addresses that they crawl from, a reverse/forward DNS check, tests for JavaScript or Flash use, or all three, and a myriad of other techniques.

On the iPhone you can easily test whether a user-agent switcher is being used by creating a Flash movie and then testing whether it has been loaded, as we all know the iPhone doesn't support Flash. So a user-agent sniff for "Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5" is useless for simulating an iPhone and its workings properly.
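That capability cross-check could be sketched as a small JavaScript function. Everything here is hypothetical - in practice the hasFlash flag would come from attempting to load a tiny Flash movie on the client, and the touch flag from testing for touch event support:

```javascript
// Sketch: cross-check a claimed iPhone user-agent against capabilities
// that a real iPhone could never have. All names are hypothetical.

function claimsToBeIPhone(userAgent) {
    // Simple substring test on the reported user-agent string
    return /iPhone/.test(userAgent);
}

function looksLikeSpoofedIPhone(userAgent, capabilities) {
    // A real iPhone has no Flash plugin, so a page that successfully
    // loads a Flash movie cannot be running on one. Lack of touch
    // support is another strong hint the agent string is faked.
    if (!claimsToBeIPhone(userAgent)) {
        return false; // not even claiming to be an iPhone
    }
    if (capabilities.hasFlash) {
        return true;  // impossible on a real iPhone
    }
    if (!capabilities.hasTouchEvents) {
        return true;  // very suspicious for a touch device
    }
    return false;
}

var fakeUA = "Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9";
looksLikeSpoofedIPhone(fakeUA, { hasFlash: true, hasTouchEvents: false }); // true - a real iPhone can't load Flash
```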

Therefore knowing whether there are JavaScript errors on the page, or being able to output debug messages to the console on an iPhone as you can easily do in all other browsers, is very useful. I didn't actually find this out until this morning, but on the iPhone you can enable this from the Settings menu with the following steps.

Go to Settings > Choose the Safari Option > At the bottom of the menu select the Developer option > Enable the "Debug Console" option.

If you then go to any web page you should see a new panel titled "Debug Console" appear at the top of the page. If there are no errors on the page it will say "No Errors". If there are errors it will list the number of them, and if there are console messages (e.g from a console.log(msg) function call) it will tell you the number, e.g "7 Logs".
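If you rely on console.log for this kind of debugging it is worth guarding the call, since some browsers of this era have no console object at all (IE with the dev tools closed, for example) and a raw console.log call will itself throw an error. A minimal defensive wrapper (the function name is my own) might look like this:

```javascript
// A defensive logging wrapper: if no console object is available the
// message is silently dropped instead of throwing a script error.
function debugLog(msg) {
    if (typeof console !== "undefined" && typeof console.log === "function") {
        console.log(msg);
        return true;  // message was written to the console
    }
    return false;     // no console available, message dropped
}

debugLog("page initialised"); // shows up under the "Logs" count in the iPhone Debug Console
```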

Selecting the console option will show a new screen in which you can view all the console messages. At the bottom of the screen are options to view "All messages", "HTML", "JavaScript" and "CSS". Clicking the relevant tab will show you messages related to those errors.

If there are JavaScript errors it will show you the line number, the error message and a description but unlike a proper browser you cannot click on the error message and view the actual source code.

In fact that is the one bugbear I have about the iPhone's Safari browser: the lack of developer options such as being able to view the source and generated source code, as well as controlling options such as white/blacklists for 3rd party cookies, JavaScript and so on. However at least you can view the errors on the device properly if you need to.

Wednesday, 31 August 2011

Would you switch back to IE9 now that they have finally made a good browser?

Now that IE9 is standards compliant and actually good will you switch back to using it?

A few months ago I wrote an article about why I hated debugging in IE8 which revolved around their single threaded event model (window.event) and the delay it took before debug messages were outputted to the console.

A member of the Microsoft team commented that he had run the test case I had created in IE9 and that it had run perfectly. No high CPU, no frozen console, no high memory usage or other performance problems at all.

As you cannot install IE9 on Windows XP, which is what I use on my home laptop (due to Vista being shite), I hadn't had the pleasure of using Internet Explorer 9 much until I installed Windows 7 on my work PC.

I have to say that Windows 7 is actually a great operating system and I especially love the way it has incorporated many of the features of clean up and performance tuning tools like Tune Up Utilities and CCleaner into the main OS.

I also have to say that Internet Explorer 9 is the first IE browser I actually like.

Not only is it blindingly fast, they have finally listened to their customers, who have been complaining for years, and made their JavaScript engine standards compliant.

Obviously this makes the Browser / Document mode spaghetti even harder to detect, and I haven't yet found a way to detect the IE9 browser mode running as IE8 or IE7 on 32 bit machines, but that is not a major issue at all.

What I am wondering is, now that IE9 is actually a good browser, with a good inbuilt development console and element inspection tools, how many developers will actually return to using it, either as their browser of choice for surfing the web or for their primary development.

My browser development survey, which I held before IE9 was released, showed that developers would rather use Chrome or Firefox for surfing and would always choose the development tools those browsers bring rather than use IE 7 or 8.

I know that I changed from using IE to Firefox as my browser of choice some eons back and I changed from Firefox to Chrome for both surfing (speed is key) and developing (no need for Firebug or any other plugin) the other year when Firefox started to suffer major performance problems.

These performance problems are usually down to having far too many plugins installed, setting the browser up to check for new versions of itself and of every installed plugin on start-up (which causes long load times), Firebug issues, and errors in the browser's own chrome. This is on top of all the constant problems that Flash seems to bring to any browser.

I use Chrome for surfing due to its speed, but if I leave a few tabs open all night that contain pages running Flash videos then by morning my CPU has flat-lined and memory has leaked like the Titanic.

This is more a problem with Flash than Chrome and I try to keep this browser for surfing as clean and as fast as possible by not installing plugins. Then if I need to hack around or do some hardcore web development I will use Firefox and all the great plugins when I need to.

I don't think I could really get hacking with Chrome alone as when I am really getting stuck into a site I need my key plugins e.g the web developer toolbar, hackbar, header modifier, Request and Response inspectors e.g HTTP Fox, Firebug, ColorZilla, YSlow and all the Proxy plugins that are available.

However for basic development, piping out messages to the console, inspecting the DOM and checking loaded elements and their responses then Chrome has everything Firebug has without all the JavaScript errors.

In fact it's surprising how many developers don't even realise how much is possible with Chrome's inbuilt developer tools, speed tests, header inspectors, element inspectors and DOM modifiers and other great debugging tools that used to be the preserve of Firebug alone.

Which leads me back to IE9.

Yes it's fast and its developer tools are okay, with options to clear cookies, disable JavaScript, inspect the DOM and view the styles, but it's no Chrome or Firebug yet.

Therefore what I want to know is how many people (and by that I mean developers) will switch back to using Internet Explorer 9 for pure web surfing?

For sure it's fast, supports the latest CSS standards and all the great HTML 5 features that we have all seen with floating fish tanks and the like. But is this enough for another browser switch?

I have all the major browsers installed at work on Win7, including IE9, Firefox, Chrome, Opera, Safari, SeaMonkey and a number of others so obscure I cannot even remember their names.

Before upgrading I even had Lynx and Netscape Navigator 4 installed, as I used to be so anal that any code such as floating DIVs had to work in the earliest browsers possible. However I only use these browsers for testing, and I stick to Chrome (for surfing) and Firefox (for hacking), which leaves little room for IE9 at the moment.

I know it's a good browser. It's standards compliant and it no longer crashes when I try to output anything to the console, so why shouldn't I try it again?

Maybe one of my readers can help me decide on the browser of choice for surfing and development and how IE9 fits into that toolset.

Maybe I am missing some really cool features that I would use all the time but to be honest I am not into throwing sheep at randoms or constantly voting sites up and down and writing comments on them. Therefore for me to switch back from Chrome to another browser such as IE9 there has to be a really good reason for me to do so.

So does anyone have any reason why I should re-consider using IE9 at work for my web surfing or development?

Sunday, 23 January 2011

Strictly Software - Online CV

Are you looking for a Web Developer with 25 years' experience?

If you are a company or individual that requires an experienced systems developer for bespoke development work on websites, databases, scripts, plugins, bots and security then you should consider contacting Strictly Software for a formal quote.

Over my 25 years of development experience I have acquired a wide range of technical skills that are very relevant in today's fast moving Internet world. A search on Google will return many examples of my work, from my publicly available free online tools and popular blog articles to a number of popular WordPress plugins.

My skill-set covers everything from large scale database design, development and performance tuning to website development, auto-blogging, user tracking and black or white hat SEO techniques. A non-exhaustive list of my skills is below.
  • Database development and administration using MS SQL 6.5 - 2012 and MySQL.
  • Performance tuning including index optimisation, query plan analysis and caching.
  • Development of relational database systems, real time systems, EAV hybrids and systems built partially with automated code.
  • A good knowledge of system views which I have used to create a multitude of scripts to help locate, update and port data and clean up systems that have been hacked with SQL injections.
  • Automated reporting, analysis, diagnosis and correction scripts.
  • Front end development in C#, ASP.NET, PHP, Java, VB, VBA, ASP Classic and Server and Client Side JavaScript on a number of commercial and personal sites as well as a number of intranets.
  • Cross browser script development and the use of unobtrusive and progressive enhancement scripting techniques.
  • XML, HTML, XHTML, RSS and OPML parsing.
  • Web 2.0 development skills including RPC, RESTful methods and good knowledge of the issues surrounding internationalisation and multi byte character sets.
  • AJAX, JSON and using Object Orientated JavaScript.
  • Intermediate CSS skills and good DOM manipulation skills.
  • Good knowledge of writing bots, crawlers and screen scrapers.
  • Experience of hooking into popular APIs such as Twitter, Google or Betfair.

I have developed a number of useful PHP plugins for WordPress, including:

  • Strictly AutoTags, an automatic tagging tool that uses keyword density and HTML tag analysis to find appropriate words to be used as tags for articles.
  • Strictly Tweetbot, an automated content analysis plugin that uses OAuth to send tweets to multiple Twitter accounts.
  • Strictly System Check, a report plugin that checks for site uptime and runs a number of automatic fixes depending on identifiable problems.

I have also created a number of web tools using PHP, including:
  • Super Search - An anonymising search engine that allowed you to search the top 3 search engines through my own proxies, written in a language I created called SCRAPE. It was shut down when one of my proxies was stopped, but it was a test to prove it could be done.
  • MyIP - A browser connection analysis tool.
  • WebScan - A tool to scan a webpage and find out key info such as malware, trackers, outbound links, spam ratings, DNS checks and much more.





As well as these PRO Twitter plugins which are available for purchase, I have numerous free to use plugins, scripts, tools and online programs that let you do all sorts of things, like HTML encode JavaScript and de-compress code that has been packed multiple times with Dean Edwards' packer mechanism.

I have also worked on a number of client side tools based on JavaScript and AJAX including my Getme.js example framework which I use alongside Sizzle for DOM manipulation.

I have personally identified a number of major problems with the common frameworks such as jQuery and Prototype, which is why I do not use other people's code unless I am required to, and I have written a number of articles about these problems which can be found here:



I also specialise in writing tools for automatic HTTP parsing and screen scraping, and have built a number of objects in C#, PHP and JavaScript that enable me to scrape and reformat articles on the fly. I used to have a fully automated business which randomised data to create seemingly specific emails, and I created a great looking website that hooked into Betfair's API to show recent results, upcoming races and market prices. It also showed racing information and news articles obtained from various sites using my own SCRAPE BOT, and had fully automated daily optimisation and maintenance as well as SEO optimisation and automatic advertising: when new articles were imported they were optimised before Tweets and Facebook/LinkedIn/Tumblr and other social media posts were automatically sent out to bring in traffic.

As someone who has to maintain a system that gets millions of page hits a day I am constantly engaged in a battle with malicious bots and users who are trying to either hack my system or steal content. Therefore I have built up quite a large amount of knowledge of best practices when it comes to identifying and preventing bad users, and this information also comes in useful when I have to make automated requests myself.

I have also written a number of articles on the topic of bad bots and scraping as well as how a bot should go about crawling a site so that it doesn't get banned including writing an example C# object to parse a Robots.txt file.
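As a rough illustration of what such a parser does (this is a JavaScript sketch of the idea, not the C# object itself, and the function names are my own), a polite crawler only needs User-agent grouping and Disallow prefix matching:

```javascript
// A minimal robots.txt check. Only User-agent and Disallow lines are
// handled, which is enough for a polite crawler to decide whether it
// may fetch a given path.
function parseRobots(robotsTxt) {
    var rules = {};           // lower-cased agent -> array of disallowed path prefixes
    var currentAgents = [];   // agents the next Disallow lines apply to
    var lastWasAgent = false; // consecutive User-agent lines form one group
    robotsTxt.split(/\r?\n/).forEach(function (rawLine) {
        var line = rawLine.replace(/#.*$/, "").trim(); // strip comments
        var m = line.match(/^(user-agent|disallow)\s*:\s*(.*)$/i);
        if (!m) { lastWasAgent = false; return; }      // blank or unhandled line
        var field = m[1].toLowerCase(), value = m[2].trim();
        if (field === "user-agent") {
            if (!lastWasAgent) { currentAgents = []; } // start a new group
            var agent = value.toLowerCase();
            currentAgents.push(agent);
            rules[agent] = rules[agent] || [];
            lastWasAgent = true;
        } else {
            lastWasAgent = false;
            if (value) { // an empty Disallow means "allow everything"
                currentAgents.forEach(function (agent) {
                    rules[agent].push(value);
                });
            }
        }
    });
    return rules;
}

function isAllowed(rules, agent, path) {
    // Fall back to the wildcard group if the agent has no specific rules
    var disallowed = rules[agent.toLowerCase()] || rules["*"] || [];
    return !disallowed.some(function (prefix) {
        return path.indexOf(prefix) === 0; // prefix match per the robots convention
    });
}
```

A crawler would call isAllowed(rules, "mybot", "/some/path") before every request, and back off entirely if the answer is false.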

I also run a number of personal sites that have increased my own personal knowledge of successful SEO strategies and I have authored a number of articles about tips and tricks I have come across that help in this regard:




As well as writing over 100 technical articles for this blog I have developed a number of popular and useful tools such as:

  • Twitter Translator - One of the first sites to offer translation of Twitter messages on the fly in all the languages Google Translate offered. This had to be shut down due to Twitter changing their open API to a closed OAuth system.
  • HTML Encoder and Decoder - This very popular tool is based on a free object I created that allows users to HTML encode and decode in JavaScript. Unlike similar scripts, my tool handles double encoding, which is a big issue when you are handling content taken from multiple sources that may or may not have been merged with other content or already encoded.
  • Javascript Unpacker and Beautifier - This popular tool is used by myself and many others to reformat compressed or packed JavaScript code into a readable format.
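The decode-until-stable idea behind the double-encoding handling can be sketched in a few lines (an illustrative sketch, not the tool's actual source, and it only covers a handful of common entities):

```javascript
// Handle double (or triple) HTML encoding by decoding repeatedly
// until the string stops changing.
function decodeOnce(s) {
    // &amp; is replaced last so each pass only strips one level of encoding
    return s.replace(/&lt;/g, "<")
            .replace(/&gt;/g, ">")
            .replace(/&quot;/g, '"')
            .replace(/&amp;/g, "&");
}

function htmlDecodeFully(s) {
    var previous;
    do {
        previous = s;
        s = decodeOnce(s);
    } while (s !== previous); // stop once a pass changes nothing
    return s;
}

htmlDecodeFully("&amp;lt;b&amp;gt;"); // "<b>" even though it was encoded twice
```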

I used to have an Apache server, which unfortunately I don't have anymore, that hosted a number of great tools and other sites, including my own URL Shortener, a Spam Checker and a Pirate Bay Proxy checker script. These used to be easily accessed on my www.strictly-software.com site which, due to unfortunate circumstances, was moved to a French hosting company, OVH, who have blocked the owner from accessing it.

Due to crazy circumstances my site was hosted, along with its SSL certificates, on their server, which also held a horse racing site I part owned. The horse racing site got sold and the owners built a new site on a different server, but they cannot stop OVH from charging them monthly fees for Windows/SQL Server/hosting etc due to a change in OVH's login process: they now require people to log in by accessing a link sent out to a specific email address.

I have no access to that address, nor do the new owners, as the old owners deleted it and it cannot be recreated. Trying to speak to OVH just results in them saying that they cannot change the email due to EU Data Protection laws. So this company is paying a huge monthly fee to keep a server running which only hosts my own site, which I cannot even access to add a new SSL certificate or move the DB and code to another webserver. That is why you don't get an HTTPS version of my site at the moment!

I used to work at a well known UK software development company based outside London.

I worked for this company for 11 years and during that time I was the architect of 3 versions of an award winning job recruitment software product. The software I developed ran over 500 different job boards and consisted of custom code (client and server) written from the ground up using Microsoft technologies.

The generic codebase I created, and the back-end management system that controls it, allow for extremely fast production times. Because of the generic design, the main delay between specification and deployment for each new site is actually the front end design, not the database or business layer development as it usually is with other systems.

I actually automated the process of creating a new site so that the non techies who do the project management could easily follow a stepped process to create a new job board by filling in a series of forms. These forms asked for information such as the new URL, the site name, any existing site to copy settings and categories from, as well as a site to copy the design from, which could then be changed.

I then automated the process of creating new database records, copying data from existing sites to the new one, updating IDs and linked lists, building text files such as ISAPI rewrite files and constant files, setting up folder structures and even creating the site in IIS so that at the end of the process there was a fully functional new site ready to be tweaked by the "colouring in" people e.g the designers LOL.

Using a mix between normalised relational structures and EAV for custom requirements alongside automatically generated de-normalised flat tables and matrices for high speed searching, the system I created perfectly straddles the sometimes problematic trade off many developers have to make between quick development, good solid design and high performance.

The skills I learnt developing this software proved that it is possible to maintain high traffic systems on Microsoft architecture, utilising legacy technologies such as ASP Classic along with a number of .NET web services and console scripts to send out bulk emails, handle incoming spam mail, import jobs by XML and many other external tasks.

By developing this system I learnt skills and techniques such as:
  • Database performance tuning.
  • Application layer performance tuning.
  • Developing highly secure websites.
  • Automated traffic analysis and real time logging, visitor fingerprinting and automatic banning techniques.
  • Object orientated and procedural development methodologies.
  • Caching, Minification, compression and other optimisation techniques.
  • JavaScript widget development, including a custom WYSIWYG editor, my own framework and many animated tools, including geographical searching using Google's Maps API and our own data.
  • Automated tasks to report, analyse and fix potential issues.
  • Good coding practices including limited 3rd party COM object use, object re-use and other well known but sadly untaught tricks of the trade.
  • C# .NET coding for various external scripts that vastly sped up processes, such as bulk mailing job alerts to thousands of users, compared to old VBScript techniques.
  • Use of NoSQL databases such as DTSearch for indexing our jobs and CVs and providing fast searches of the files via their COM interface.

A cursory look over this blog and my very old site www.strictly-software.com (which I cannot actually access anymore due to the OVH server issue mentioned earlier) will show you the wide variety of skills I am trained in, and hopefully the depth of knowledge that my articles and examples deliver proves that I know what I am talking about.

If you are interested in learning more about my work or are looking to hire someone for web or database development then you should contact me for further details.

If you are interested in having a performance audit on your legacy systems before considering whether to rewrite or tune up then you can also contact me at the details provided in the contact link in the footer.