Sunday, March 1, 2015

How to hash data/feed files for file transfers (SFTP) using C#.NET and SQL Server 2014 (cryptography)

The Microsoft .NET Framework provides cryptography classes. These classes are useful for encrypting, decrypting, and hashing sensitive data files during file transfers.

In the FTP world, organizations transfer files to third parties securely using the SFTP protocol, often with PGP or custom encryption applied to the data files.

Why Hash?

  • Hashes play a role in security systems, where they're used to ensure that transmitted data files have not been tampered with.
  • The sender generates a hash of the data file, encrypts it, and sends it along with the data file itself.
  • The recipient decrypts both the data file and the hash.
  • The recipient then produces a new hash from the received data file and compares it with the sender's hash.
  • If they match, there is a very high probability that the file was transmitted intact.
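The sender/recipient flow above can be sketched in Python (a minimal illustration of the hash-and-compare step only; encrypting the hash for transmission is out of scope here):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the uppercase SHA-256 hex digest of the given bytes."""
    return hashlib.sha256(data).hexdigest().upper()

# Sender side: hash the file contents before transmission.
payload = b"I am a good boy"
sender_hash = sha256_hex(payload)

# Recipient side: re-hash what was received and compare to the sender's hash.
received_ok = payload
received_tampered = b"I am a good boy!"

assert sha256_hex(received_ok) == sender_hash        # intact transfer
assert sha256_hex(received_tampered) != sender_hash  # tampering detected
```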
For demo purposes I created a console application in C#.NET using the System.Security.Cryptography classes, plus T-SQL scripts in SQL Server 2014.
Algorithm: SHA-256

Demo

Step 1:

I created a text file at C:\Hari\hari.txt containing the text "I am a good boy".
The process generated a hash for hari.txt:
-- hash value for the original text ("I am a good boy")

F8C5F1FF1157F91FF64D1EFCA65B0A0318E84A21F72C3A6F5571E186A1D49D34

Step 2:

Next I opened C:\Hari\hari.txt and changed the text to "I am good boy, I am from Jamaica".
The process generated a new hash for hari.txt:
-- the hash value changed when I modified the text file.
EC90609F3C3A58CA5DD499EB6833C4D66DA60E216A5A67BD9AAAFEB7133F6969


Step 3:

When I compared the two hash values, they did not match, indicating possible tampering with the file.
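The three steps above can be reproduced in Python against a temporary file (a sketch using only the standard library, not the post's C# code; a temp path is used here rather than C:\Hari\hari.txt):

```python
import hashlib
import os
import tempfile

def file_sha256(path: str) -> str:
    """Hash a file's contents with SHA-256, returning uppercase hex."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest().upper()

path = os.path.join(tempfile.mkdtemp(), "hari.txt")

# Step 1: create the file and hash it.
with open(path, "w") as f:
    f.write("I am a good boy")
hash_before = file_sha256(path)

# Step 2: modify the file and hash it again.
with open(path, "w") as f:
    f.write("I am good boy, I am from Jamaica")
hash_after = file_sha256(path)

# Step 3: the hashes no longer match -- possible tampering.
assert hash_before != hash_after
```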

The following code shows how the hash is generated.

C# demo code:



using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Security.Cryptography;
using System.IO;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            //hash some test file
            string path = @"C:\hari\hari.txt";

            string checksumMd5 = GetChecksum(path, Algorithms.MD5);
            string checksumSha1 = GetChecksum(path, Algorithms.SHA1);
            string checksumSha256 = GetChecksum(path, Algorithms.SHA256);
            string checksumSha384 = GetChecksum(path, Algorithms.SHA384);
            string checksumSha512 = GetChecksum(path, Algorithms.SHA512);
            string checksumRipemd160 = GetChecksum(path, Algorithms.RIPEMD160);
            //print the SHA-256 checksum
            Console.Write(checksumSha256);
            //wait for user input
            Console.ReadKey();
  
        }

        // hash algorithm instances available for checksums
        public static class Algorithms
        {
            public static readonly HashAlgorithm MD5 = new MD5CryptoServiceProvider();
            public static readonly HashAlgorithm SHA1 = new SHA1Managed();
            public static readonly HashAlgorithm SHA256 = new SHA256Managed();
            public static readonly HashAlgorithm SHA384 = new SHA384Managed();
            public static readonly HashAlgorithm SHA512 = new SHA512Managed();
            public static readonly HashAlgorithm RIPEMD160 = new RIPEMD160Managed();
        }

        // hash a file by streaming its contents through a buffered reader
        public static string GetChecksum(string filePath, HashAlgorithm algorithm)
        {
            using (var stream = new BufferedStream(File.OpenRead(filePath), 100000))
            {
                byte[] hash = algorithm.ComputeHash(stream);
                return BitConverter.ToString(hash).Replace("-", String.Empty);
            }
        }
    }

}



T-SQL script in SQL Server 2014.

I used the BINARY_CHECKSUM and HASHBYTES functions.
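The original T-SQL script is not reproduced above, but in SQL Server 2012 and later the equivalent call is HASHBYTES('SHA2_256', @data). Its output should match the C# code and this Python cross-check byte for byte, provided the same bytes are hashed on both sides (a VARCHAR literal hashes its raw single-byte characters; an NVARCHAR input would hash UTF-16 bytes and give a different digest):

```python
import hashlib

# SQL Server (2012+): SELECT HASHBYTES('SHA2_256', 'I am a good boy');
# hashes the raw VARCHAR bytes; the line below hashes the same ASCII bytes.
digest = hashlib.sha256(b"I am a good boy").hexdigest().upper()
print(digest)  # 64 hex characters
```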




Monday, February 9, 2015

Why is shrinking a SQL Server database data file not a good practice?

Often in the DBA world you hear about shrinking data files and log files. Shrinking a data file is not a good approach and leads to heavy index fragmentation.

Let me demonstrate with an example. I am using the following SQL Server version; however, the scripts will also run on 2008 R2.

Microsoft SQL Server 2014 - 12.0.2254.0 (X64)
Jul 25 2014 18:52:51
Copyright (c) Microsoft Corporation
Developer Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)
Shrinking a database causes heavy index fragmentation: when SQL Server tries to free up space, it moves pages from the end of the file to the beginning of the file.
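A toy model of why this fragments indexes (pure illustration, not SQL Server internals): shrink takes the page nearest the end of the file and drops it into the first free slot, so pages that were in logical order end up in reverse physical order:

```python
# Data file as a list of slots: free space at the front (e.g. after dropping
# a large table), an index's pages in logical order P1..P4 at the end.
FREE = None
data_file = [FREE, FREE, FREE, FREE, "P1", "P2", "P3", "P4"]

def shrink(slots):
    """Toy shrink: repeatedly move the last used page into the first free slot."""
    slots = list(slots)
    while True:
        last = max((i for i, s in enumerate(slots) if s is not None), default=-1)
        first_free = next((i for i, s in enumerate(slots) if s is None), None)
        if first_free is None or first_free > last:
            break  # no free slot before the last page: nothing left to move
        slots[first_free] = slots[last]
        slots[last] = None
        slots.pop()  # release the now-empty slot at the end of the file
    return slots

print(shrink(data_file))  # ['P4', 'P3', 'P2', 'P1'] -- logical order reversed
```

The pages end up stored back to front, which is exactly the near-100% fragmentation the demo below measures.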



/*
Author: Hari Prasad Ere
Sub: Shrink DB file demo.
*/

--Create database for demo.
IF DATABASEPROPERTYEX (N'DbShrinkTest', N'Version') IS NOT NULL
    DROP DATABASE [DbShrinkTest];
GO
CREATE DATABASE DbShrinkTest;
GO
USE [DbShrinkTest];
GO

-- suppress row count messages
SET NOCOUNT ON;
GO

 -- Create test tables
CREATE TABLE [Wine] (
    [id] INT IDENTITY,
    [Winetype] CHAR (8000) DEFAULT 'Malbec');
GO
-- Fill the wine table by executing the batch 3000 times
INSERT INTO [Wine] DEFAULT VALUES;
GO 3000


-- Create the product table
CREATE TABLE [ProdTable] (
    [ID] INT IDENTITY,
    [ProdName] CHAR (8000) );
CREATE CLUSTERED INDEX [prod_ID] ON [ProdTable] ([ID]);
GO

-- Load the product table from the AdventureWorks2014 Production.Product table by executing the batch 5 times
INSERT INTO [ProdTable] (prodname)
SELECT [Name] from AdventureWorks2014.Production.Product
GO 5

-- select * from wine
--select * from prodtable

-- Check the fragmentation of the production table
SELECT
    [avg_fragmentation_in_percent]
FROM sys.dm_db_index_physical_stats (
    DB_ID (N'DbShrinkTest'), OBJECT_ID (N'ProdTable'), 1, NULL, NULL);
GO
-- avg_fragmentation_in_percent returned on my server: 0.396825396825397.

-- Now let's start the test: drop the wine table and run the shrink
DROP TABLE [Wine];
GO

-- Shrink the database
DBCC SHRINKDATABASE ([DbShrinkTest]);
-- (Note: DBCC SHRINKFILE also leads to index fragmentation.)
GO
[Screenshot: DBCC SHRINKDATABASE output showing DbId, FileId, CurrentSize, MinimumSize, UsedPages, EstimatedPages]


-- Check the fragmentation of the production table
SELECT
    [avg_fragmentation_in_percent]
FROM sys.dm_db_index_physical_stats (
    DB_ID (N'DbShrinkTest'), OBJECT_ID (N'ProdTable'), 1, NULL, NULL);
GO
-- avg_fragmentation_in_percent returned on my server: 99.9206349206349.


-- cleanup
IF DATABASEPROPERTYEX (N'DbShrinkTest', N'Version') IS NOT NULL
    DROP DATABASE [DbShrinkTest];
GO


Workarounds

  1. If your environment requires shrinking, REORGANIZE or REBUILD the indexes after the SHRINKFILE/SHRINKDATABASE.
  2. Add a new filegroup, move the user tables and indexes into it by rebuilding the indexes on the new filegroup, and then shrink the file in the old filegroup.