Storing Large Amounts of Data in a DataTable

  • 14 years ago

    Hi All,
    I am having a problem storing some data within a DataTable. Basically, what I am aiming to do is read data from a CSV file into a DataTable. The CSV files in question are very large, as they contain the output of our advertising strategies downloaded directly from Google AdWords. They are in fact so large that Excel cannot load all the rows.
    I use the following code to read the data in from the CSV file and return a DataTable:

    private DataTable DeLimitFile(string FileName, char Delimiter, bool RowHeader)
            {
                string AltDelimiter = "We dont actually need this but I left it in for the sake of it ---- Mick";
                string FinalDelimiter;
                if (Delimiter == Convert.ToChar(9))
                {
                    FinalDelimiter = "\\t";
                }
                else
                {
                    FinalDelimiter = Delimiter.ToString();
                }
                bool HeaderFlag = RowHeader;
                string Pattern = "(?<=\")[^" + FinalDelimiter + "].*?[^" + FinalDelimiter + "](?=\")";
                System.Text.RegularExpressions.Regex Regex = new System.Text.RegularExpressions.Regex(Pattern);
                System.IO.StreamReader MyReader = new System.IO.StreamReader(FileName);
                DataTable MyTable = new DataTable();
                bool GetRowCount = true;
                while (MyReader.Peek() != -1)
                {
                    string MyLine = MyReader.ReadLine();
                    System.Text.RegularExpressions.MatchCollection MyMatches = Regex.Matches(MyLine);
                    foreach (System.Text.RegularExpressions.Match Match in MyMatches)
                    {
                        string NewString = Match.Value.Replace(FinalDelimiter, AltDelimiter);
                        MyLine = MyLine.Replace("\"" + Match.Value + "\"", NewString);
                    }
                    string[] Splits = System.Text.RegularExpressions.Regex.Split(MyLine, FinalDelimiter);
                    if (GetRowCount == true)
                    {
                        for (int I = 0; I <= Splits.GetLength(0) - 1; I++)
                        {
                            DataColumn MyColumn = new DataColumn();
                            if (HeaderFlag == true)
                            {
                                MyColumn.ColumnName = Splits[I];
                            }
                            MyTable.Columns.Add(MyColumn);
                        }
                        GetRowCount = false;
                    }
                    if (HeaderFlag == false)
                    {
                        for (int X = 0; X <= Splits.GetLength(0) - 1; X++)
                        {
                            Splits[X] = Splits[X].Replace(AltDelimiter, FinalDelimiter);
                        }
                        MyTable.Rows.Add(Splits);
                    }
                    else
                    {
                        HeaderFlag = false;
                    }
                }
                MyReader.Close();
                return MyTable;
            }

    However, when I try to do anything with the returned data, I get the error "Input array is longer than the number of columns in the table".
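    For context, that exception is thrown by DataRowCollection.Add whenever the string[] handed to Rows.Add has more elements than the table has columns, i.e. some later line split into more fields than the first line did (typically an unprotected delimiter inside a field, or a quoted field the regex pattern failed to match; note also that for tab files FinalDelimiter becomes the two-character literal "\t", which String.Replace will never find in the actual text, so quoted tabs survive to the split). A minimal repro with one possible guard, purely illustrative:

```csharp
using System;
using System.Data;

class RowLengthDemo
{
    static void Main()
    {
        // Two columns, as if built from a two-field header line.
        var table = new DataTable();
        table.Columns.Add("Campaign");
        table.Columns.Add("Clicks");

        // A later line that split into three fields, e.g. because of an
        // unprotected delimiter inside one of its values.
        string[] tooWide = { "Brand", "UK", "120" };

        try
        {
            table.Rows.Add(tooWide); // throws ArgumentException
        }
        catch (ArgumentException)
        {
            Console.WriteLine("row wider than table");
        }

        // One way around it: grow the table to fit the widest row seen.
        while (table.Columns.Count < tooWide.Length)
            table.Columns.Add();

        table.Rows.Add(tooWide);
        Console.WriteLine(table.Rows.Count); // 1
    }
}
```

    Growing the table only masks the underlying split problem, of course; the cleaner fix is to make the quoted-field handling reliable so every line splits into the same number of fields.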

    Does anyone have any idea how I can get around this error?

    Kind Regards




  • 14 years ago
    The native way to automatically import data in CSV format is to use MS SQL DTS packages; DTS supports CSV data sources.
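    Staying inside the application, .NET also ships a CSV-aware parser, Microsoft.VisualBasic.FileIO.TextFieldParser (usable from C# by referencing the Microsoft.VisualBasic assembly), which handles quoted fields natively instead of via a hand-rolled regex. A rough sketch, with hypothetical file contents and column names standing in for the AdWords export:

```csharp
using System;
using System.Data;
using System.IO;
using Microsoft.VisualBasic.FileIO; // reference Microsoft.VisualBasic.dll

class CsvLoader
{
    // Reads a delimited file into a DataTable, treating the first line as headers.
    public static DataTable LoadCsv(string path, string delimiter)
    {
        var table = new DataTable();
        using (var parser = new TextFieldParser(path))
        {
            parser.TextFieldType = FieldType.Delimited;
            parser.SetDelimiters(delimiter);
            parser.HasFieldsEnclosedInQuotes = true; // quoted fields may contain the delimiter

            string[] header = parser.ReadFields();
            foreach (string name in header)
                table.Columns.Add(name);

            while (!parser.EndOfData)
            {
                string[] fields = parser.ReadFields();
                // Pad (or truncate) so Rows.Add never sees a width mismatch.
                Array.Resize(ref fields, table.Columns.Count);
                table.Rows.Add(fields);
            }
        }
        return table;
    }

    static void Main()
    {
        // Hypothetical sample data; the quoted field contains the delimiter.
        string path = Path.GetTempFileName();
        File.WriteAllText(path, "Campaign,Clicks\n\"Brand, UK\",120\nGeneric,85\n");

        DataTable t = LoadCsv(path, ",");
        Console.WriteLine(t.Columns.Count);       // 2
        Console.WriteLine(t.Rows.Count);          // 2
        Console.WriteLine(t.Rows[0]["Campaign"]); // Brand, UK
        File.Delete(path);
    }
}
```

    Because the parser streams the file line by line, it copes with files too big for Excel; the Array.Resize guard does silently truncate over-wide rows, so log those if data loss matters.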
  • 14 years ago
    Thanks for the reply Michael, I will look into it.
