
How to fix the Hibernate MySQL connection timeout issue (solved)

When we began creating an application using the Struts2 framework with Hibernate and MySQL, we ran into a problem: MySQL closes a connection that has been left unused for 8 hours (its default wait_timeout), and the application then fails when it tries to reuse the stale connection. It consumed a lot of our time, but we were eventually able to nail down the issue.

The steps we took to fix the Hibernate MySQL connection timeout problem are listed below; a configuration sketch follows the list.

– Download Hibernate C3P0 and copy .jar files
– Set c3p0.properties
– Make changes to hibernate.cfg.xml
– Test MySQL connection timeout
– Hibernate, MySQL connection timeout related error messages
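
For reference, here is a minimal sketch of the kind of hibernate.cfg.xml changes involved. The property keys are the standard Hibernate c3p0 settings, but the values are illustrative and should be tuned to your environment; the provider class shown is the Hibernate 3 name, and newer Hibernate versions pick up c3p0 automatically when hibernate.c3p0.* properties are present. The idea is to have the pool test and retire idle connections well before MySQL's 8-hour wait_timeout closes them.

    <!-- hibernate.cfg.xml: hand connection pooling over to c3p0 -->
    <property name="hibernate.connection.provider_class">org.hibernate.connection.C3P0ConnectionProvider</property>
    <property name="hibernate.c3p0.min_size">5</property>
    <property name="hibernate.c3p0.max_size">20</property>
    <!-- retire connections idle for more than 5 minutes, far below MySQL's wait_timeout -->
    <property name="hibernate.c3p0.timeout">300</property>
    <!-- test idle connections every 60 seconds so stale ones are replaced -->
    <property name="hibernate.c3p0.idle_test_period">60</property>
    <property name="hibernate.c3p0.max_statements">50</property>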


XML (Extensible Markup Language) vs. CSV (Comma Separated Values)

CSV is a flat file format with data values separated by commas. If we do not need to establish relationships within the data, we can store it as CSV and manipulate the content for display in web pages.

XML allows a hierarchical representation of data, and data in XML is more readable when it comes to presentation. It can easily be validated against an XSD and accessed with a couple of lines of code. A huge advantage of XML is its flexibility in establishing relationships within the data.
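
As a rough illustration of the "couple of lines of code", here is a sketch of XSD validation using the standard javax.xml.validation API; the file names orders.xsd and orders.xml are hypothetical.

    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    public class XsdCheck {
        public static void main(String[] args) throws Exception {
            // Compile the schema, then validate the document against it
            SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new File("orders.xsd"));
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(new File("orders.xml"))); // throws SAXException if invalid
            System.out.println("orders.xml is valid");
        }
    }

validate() throws an exception describing the first violation, which makes it easy to reject malformed data before it reaches the web page.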


SQL Injection: Whitelist validation vs. blacklist validation

Whitelist validation refers to data validation techniques such as checking the data type, data length, and input range, and checking the nature of the data by its format (for example, a phone number will be ten digits separated by hyphens, satisfying the format nnn-nnn-nnnn). Regular expressions may be used for format validation of inputs.
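
A minimal sketch of such a whitelist check in Java, assuming the nnn-nnn-nnnn phone format described above:

    import java.util.regex.Pattern;

    public class PhoneValidator {
        // Whitelist rule: exactly ten digits in nnn-nnn-nnnn form; everything else is rejected
        private static final Pattern PHONE = Pattern.compile("\\d{3}-\\d{3}-\\d{4}");

        public static boolean isValidPhone(String input) {
            return input != null && PHONE.matcher(input).matches();
        }
    }

Anything that does not match the expected format is rejected outright, which is what makes whitelisting stronger than trying to enumerate every bad input.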

Blacklist validation refers to rejecting data based on a known-bad list filter. This is not a powerful solution on its own, as the list of possible exclusions is long and it is difficult to cover every scenario. Blacklist validation should be used in conjunction with whitelist validation, but in cases where whitelist validation cannot be applied, at least blacklist validation should be implemented.


SQL Injection: Database Code Security in Programming

Dynamic SQL (a concatenated SQL string) is a coding practice whereby queries are built up in the program and sent to SQL Server for execution. This allows attacker-supplied code to be injected into the dynamic queries, causing damage to the database.

A good alternative is to use parameterized queries, where placeholders are set for the variables. Because the input is bound as data rather than concatenated into the SQL text, the possibility of queries being infected with injected code is removed.
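
A minimal sketch of a parameterized query with plain JDBC; the users table and its columns are hypothetical:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class UserLookup {
        // The ? placeholder keeps user input out of the SQL text entirely
        private static final String SQL = "SELECT id, name FROM users WHERE email = ?";

        public static void printUser(Connection connection, String userSuppliedEmail) throws SQLException {
            try (PreparedStatement ps = connection.prepareStatement(SQL)) {
                ps.setString(1, userSuppliedEmail); // bound as data, never parsed as SQL
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getInt("id") + " " + rs.getString("name"));
                    }
                }
            }
        }
    }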

In addition to using parameterized queries, it is always a best practice to sanitize the input parameters before using them in queries.

Input data should also be encoded appropriately, especially where dynamic SQL is used, and appropriate encoding should be applied to data extracted from the database before display, to avoid cross-site script execution.
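
As a rough sketch of output encoding, here is a minimal hand-rolled HTML escaper; in a real application a vetted library (for example, the OWASP Java Encoder) is the safer choice:

    public class HtmlEscaper {
        // Replace the characters that would let stored content be interpreted as markup or script
        public static String escapeHtml(String s) {
            if (s == null) {
                return "";
            }
            return s.replace("&", "&amp;")   // must come first, before the other entities are added
                    .replace("<", "&lt;")
                    .replace(">", "&gt;")
                    .replace("\"", "&quot;")
                    .replace("'", "&#39;");
        }
    }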


PRG – Post/Redirect/Get design pattern to redirect after a POST

Post/Redirect/Get (PRG) is a design pattern used in web applications to prevent double posting or duplicate form submissions, which often happen when a page is refreshed or reloaded.

When a form is submitted to the server, the server responds with HTML content. If that content is displayed directly in the browser and the user then refreshes the page, the form gets submitted a second time. And because the content is the response to a POST, it cannot be bookmarked either.

To avoid these problems, applications use the PRG design pattern, which redirects the user to a page instead of displaying the POST response directly.
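
A minimal sketch of PRG in a servlet; the saveOrder helper and the confirmation page name are hypothetical:

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class OrderServlet extends HttpServlet {
        @Override
        protected void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException {
            saveOrder(request); // hypothetical helper that processes the submitted form
            // Redirect instead of rendering HTML: a refresh now re-issues a harmless GET
            response.sendRedirect("order-confirmation.jsp");
        }

        private void saveOrder(HttpServletRequest request) {
            // persist the form data here
        }
    }

After the redirect, the page the user is looking at is the result of a GET, so it can be refreshed and bookmarked safely.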

More information on the PRG design pattern is available on Wikipedia.


SEO: Steps to take after submitting to search engines

Part 5: Steps to take after submitting to search engines (from the web)

Maintenance check

All web sites should be thoroughly tested using a site maintenance tool in order to catch errors in operation before customers are brought to the site. HTML errors can hinder a search engine spider’s ability to index a site; they can also keep a search engine from reading a page, or cause a page to be rendered differently from how it was intended. To avoid potential visitor disasters due to site errors, the webmaster should use NetMechanic’s HTML Toolbox or another site maintenance tool.


SEO: Steps to take before submitting

Part 4: Steps to take before submitting (from the web)

Fine-tune the TITLE tag to increase traffic to the site

Improving the TITLE tag is one technique that applies to just about all the search engines. The appearance of keywords within the page title is one of the biggest factors determining a web site’s score in many engines. Changing the title of each page to include site keywords that work well with that page can greatly increase the chance of the page getting noticed by the search engines, and can increase the page rank of the site as well.


SEO: Factors that Affect Search Engine Ranking

Part 3: Factors that Affect Search Engine Ranking

o Domain Extension – New extensions are not always immediately recognized by the search engines. This was a problem for .cc and .biz sites in the early going.
o Sub domains – If your web site is ‘mysite.network.com’ and ‘network.com’ has engaged in any unsavory search engine spamming, your site will be affected.
o Always get your own domain, even if you use a sub domain for your shopping cart, etc.
o IP Address/Range – This is a bit like the last point. If the search engines have had problems with many sites from one hosting company, they may degrade all the sites from that company’s IP range. It makes the hosting companies behave.
o “Domain in use since” – The longer a domain has been live, the better it is generally viewed. Kind of a “respect thy elders” thing.

Negatives That Affect Your Position Within The Search Engines:

o Broken links – Internal and outgoing.
o Meta tag stuffing – Bloating the Meta description or keywords with large amounts of text.
o Irrelevance – Using keywords or a description that are irrelevant to the content of the site.
o Tiny text – Using text that is too small for the eye to see.
o Invisible text – Text the same color as the background.
o Meta refresh tag redirects – Where you try to reach one page, but the address changes to a different one.
o Excessive search engine submission – Over-submitting may get your site banned.
o Frames – Be careful when you use them. You need to embed key terms in them because, generally, the search engines can only see the frameset, not the primary content that a visitor sees.
o Missing alt attributes – The alt attribute is a mandatory element of img tags, and omitting it is viewed as bad coding.
o Compounded words in the content or tags will not help the web site for individual terms – e.g. ‘specialriskinsurance’ as opposed to ‘special risk insurance’ or ‘special-risk-insurance’.
o Excessive punctuation in the TITLE and description tags – It wastes precious space, and some characters are ignored or may cause a problem for the spider (the pipe “|” is one that should be avoided).


SEO: What do Search Engines look for?

Part 2: Things Search Engines Look For in brief

~ Link Popularity (check PageRank in Google’s case).
~ How many other web pages around the Internet point to your web site?
~ Are these pages related to each other?
~ Are they considered valuable resources?
~ Anchor Text of inbound links.
~ Does the link to your web site have relevant keywords in it?
~ Even if it is not directly relevant, an important web page that links to your site will still help your web site.
~ Presence on marked authority pages (e.g. DMOZ).
~ URL quotation – i.e. when a page mentions the site by URL but doesn’t link to it. This commonly occurs in news articles that mention web sites. While it doesn’t count as a link, it does count as a reference.
~ Number of links on pages linking to this page. If the link to your web site is the only one on a page, it’s viewed as more valuable than being one link among 100.
~ Freshness of links on pages linking to your web site. While the engines will count all links, a link from a web site that has not been updated in a year or two will be less valuable than one from a site that is updated daily. It indicates activity and interest levels.
~ Page Last Modified (Freshness) – Just like the last point, a page that is updated frequently is favored.
~ Reciprocal Links – Search engines like to see a closed loop, i.e. that a referring site is also used as a reference. So when you are giving away a link, ask for one back. It will help both web sites.
~ Keyword frequency across all pages. Does the content really speak to the subject that the page and the web site are supposed to be about?
~ Keywords in the URL – Using keywords in the URL does have an effect on the search engine algorithms.
~ You can use keywords in the filename. For example, if the page is about car parts, call it “http://www.sitename.com/car-parts.html”; use dashes “-” rather than underscores “_” to separate words in filenames.
~ Response Time – If your site is fast, it’s favored.
~ Server Downtime – If the search engine robot comes by and frequently can’t connect, your site will be penalized.
~ Page Size – The engines tend to weigh content at the start of a document more heavily than content further down. If a page is long, look at breaking it into sections. If a page is over 50k, it’s too long.


SEO: What do Search Engines look for?

Part 1: Things Search Engines Look For in brief

Each search engine weights each of these factors differently, and places the emphasis in different spots.

Search Engines Look For:

~ Title tag – You need a relevant title, not just “Home Page”; use it with keywords and the description consistently across the site.
~ Headings h1 … h6 – The search engines view <h> tags as terms of emphasis and give weight to the words within them. Put key terms in them.
~ Bold – Of lesser importance than <h> tags, the <b> tags still emphasize terms of importance.
~ Alt text – Use descriptive short sentences in your alt attributes. If it’s a picture of a rose and you’re a florist, try “Red Rose – Available at ‘name’ Flower Shop”.
~ Email addresses on the page – If you put up an address, make sure the domain name in the address matches the web site domain.
~ Keyword meta tags – Some engines use them directly; some check them as part of a validation process to see whether they match the content.
~ Meta description tag – Most engines look at this tag. Use a distinct one for each page across your site, and make it particular to that page.
~ Key term placement – Terms that are higher up on a page are more heavily weighted.
~ Key term proximity – Terms that are close together are probably related, and thus the site will show up in searches for those terms.
~ Comment tags – Some engines use comment tags for content. Most engines look at them in graphic-rich / text-poor sites.
~ Page structure validation – Properly coded pages are likely to be of better overall quality, and are thus rewarded.
~ Traffic/Visitors – The search engines do keep track of how many people follow their links.
