Overview of ASQL
The Madics ASQL is a connector between a Microsoft SQL database and Madics, developed by the A-Shell/ATE developer. Much like Madics in Access and DVerto, ASQL connects directly to your Madics data files and updates the SQL database overnight or via Madics record-change triggers. But unlike Madics in Access and DVerto, which were both written externally to the Madics application in Delphi, ASQL is 100% pure Madics.
There are a number of improvements in using ASQL over Madics in Access and DVerto, for example:
- The SQL connector is written and supported by the A-Shell/ATE developer.
- The interface is written in the same language (A-Shell) as the rest of Madics.
- Table and field names can be tailored to your requirements.
- Benchmarks show that, as it is a straight Madics (A-Shell) to SQL connection, it is 80% faster than Madics in Access and 50% faster than DVerto.
- MadicsSQL also sends email notifications after full table updates or if a SQL error occurs.
- File triggers can be enabled so that when data is changed in Madics it is mirrored to the SQL table within minutes.
- Tables are copied/backed up for the previous x nights before being updated overnight.
- Madics CSV files usually used for Excel can be turned into SQL tables for external reporting.
- SQL table and field naming is more flexible, unlike Madics in Access which is very strict and hard-coded.
- SQL table 'refreshes', where the table/data is not deleted at the start and records are just refreshed.
- Multiple Madics SQL instances can be run, giving faster overnight updates and faster trigger processing during the day.
- Ability to import non-Madics SQL table data into Madics (additional programming required).
- A-Shell/Linux supports MySQL; A-Shell/Windows in theory supports any database for which there is an ODBC 3.0 driver (aka “data source”).
- If your server is Linux we use a stand-alone A-Shell/Windows to connect your data to a Microsoft SQL Server.
- Supports Madics data files greater than 2GB in size, unlike Madics in Access and DVerto.
- It is open to far more future development.
Please email [email protected] for prices.
Using ASQL
ASQL is usually installed within a new Windows/A-Shell installation on a Windows PC; this connects to a SQL database via ODBC and to the Madics data files over a Samba network connection to the Madics Linux server.
There is a front-end showing the configured tables/files, along with the ability to configure overnight schedules and data-change triggers, plus the ability to do an instant update.
Columns:
- Scheduled update sequence
- Madics file ID
- Description
- SQL table
- SQL table primary key
- Scheduled full import
- Scheduled refresh only
- Scheduled bulk import
- Triggers enabled
- Number of fields in table
- Number of records in Madics
- Number of records in SQL table
- Append mode
- Number of records added on last update
- Total duration of table update
- Date and time the table was created.
- Last Trigger execution Date/Time
- Multi-thread server ID
- Update Sequence
This is the sequence in which the tables are updated overnight; you can drag and drop the rows to re-sequence them.
- Full Import (Now)
Tables to update now (when F1 is pressed)
- Refresh Only (Now)
Tables to update now (when F1 is pressed)
- Bulk Import (Now)
Tables to update now (when F1 is pressed)
- Scheduled Full Import
Tables that are to be included in the overnight scheduled full import.
- Scheduled Refresh Only
Tables that are to be included in the overnight scheduled table refresh.
- Scheduled Bulk Import
Tables that are to be included in the overnight scheduled bulk import.
- Triggers Enabled
Data file triggers update the SQL table after data is changed in Madics.
- Append Mode
If append mode is enabled then only records added to the Madics data file since the last import will be added (a sketch of the idea follows this list). If you require a full fresh update of this table then delete the SQL table in SQL Server Management Studio.
- Server ID
Denotes which multi-thread server to use if more than one is enabled.
If multiple licences are purchased, it is possible to run the Madics SQLUPD server multiple times and have them all running together, speeding up the overnight update process and processing the file hook changes faster during the day.
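A hedged sketch of the append-mode idea (it assumes that "records added since the last import" is tracked by Madics record number, which suits append-only transaction files such as history logs; ASQL's actual bookkeeping may differ):

def append_mode_import(last_record_imported, total_records, read_record, insert_record):
    # Only the records written to the Madics file since the last run are read and inserted.
    for recno in range(last_record_imported + 1, total_records + 1):
        insert_record(read_record(recno))
    return total_records      # remembered for the next run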
Full Import vs Refresh vs Bulk Import
- The Full Import deletes the table, recreates all the fields and re-populates all the data each time.
- The Refresh Only does not delete the table; it updates any existing records and appends new records. At the end of the update it deletes any records that existed but were not updated, as these are deemed to have been deleted in Madics. The benefit of running in this mode is that it can be run during the day without affecting any other application also querying the table.
- This method will be a little slower than a Full Import, as it looks up each record to see if it exists first and, if not, inserts it, whereas the Full Import just inserts the records because the table was cleared at the start.
- Please note that if fields are added to or deleted from the table via the .DVT file, they also need to be manually added to the SQL table, or a Full Import run to refresh the table.
- Bulk Import - The Full Import and Refresh Only update the SQL database Madics record by record, whereas the Bulk Import reads all the Madics records upfront, creating dozens of SQL script files; these are then all executed at once at the end. Each script file is batched by default at 100 records at a time, but this can be changed in the configuration settings. In several situations we have found this quicker than the one-by-one record approach, but not always.
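The Refresh Only logic can be pictured with a small hedged sketch (illustrative only - ASQL itself is written in A-Shell; pyodbc, the Sales_Reps table and its Rep_Number/Name columns are borrowed from the Sales Rep example later in this document):

import pyodbc

def refresh_only(records):
    """records: an iterable of (rep_number, name) tuples already read from Madics."""
    conn = pyodbc.connect("DSN=maSQL")      # the maSQL ODBC data source from the Set-up section
    cur = conn.cursor()
    touched = set()
    for rep_number, name in records:
        # Look each record up first; update it if it exists, otherwise insert it.
        cur.execute("UPDATE Sales_Reps SET Name = ? WHERE Rep_Number = ?", name, rep_number)
        if cur.rowcount == 0:
            cur.execute("INSERT INTO Sales_Reps (Rep_Number, Name) VALUES (?, ?)", rep_number, name)
        touched.add(rep_number)
    # Rows that were not touched existed in SQL but no longer exist in Madics, so delete them.
    for (rep_number,) in cur.execute("SELECT Rep_Number FROM Sales_Reps").fetchall():
        if rep_number not in touched:
            cur.execute("DELETE FROM Sales_Reps WHERE Rep_Number = ?", rep_number)
    conn.commit()
    conn.close()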
Example snippet from a log file updating a Stock Movement Log Table.
15/03/19 14:59:11 STEVE - 1: ** Table Update for Stock Movement Log **
15/03/19 14:59:11 STEVE - 1: Delete Table Stock_Movements
15/03/19 14:59:11 STEVE - 1: Create Table Stock_Movements for SMVLOG
15/03/19 14:59:11 STEVE - 1: Copy SMVLOG.DAT to C:\TemporarySQLdata\SMVLOG.DAT
15/03/19 14:59:17 STEVE - 1: Copy Successful
15/03/19 14:59:17 STEVE - 1: SEARCH for 160104, KeyPos: 33,EndPos: 38, Len: 6
15/03/19 14:59:17 STEVE - 1: KEY 160104 Found, Starting from record 2315071
15/03/19 14:59:17 STEVE - 1: Reading 4115346 records of 6430416 from SMVLOG.DAT
15/03/19 15:13:47 STEVE - 1: Completed 10% (411535 records)
15/03/19 15:27:51 STEVE - 1: Completed 20% (823070 records)
15/03/19 15:41:46 STEVE - 1: Completed 30% (1234605 records)
15/03/19 15:56:54 STEVE - 1: Completed 40% (1646140 records)
15/03/19 16:11:09 STEVE - 1: Completed 50% (2057675 records)
15/03/19 16:25:31 STEVE - 1: Completed 60% (2469210 records)
15/03/19 16:39:55 STEVE - 1: Completed 70% (2880745 records)
15/03/19 16:54:16 STEVE - 1: Completed 80% (3292280 records)
15/03/19 17:08:37 STEVE - 1: Completed 90% (3703815 records)
15/03/19 17:22:34 STEVE - 1: COMPLETED table update for SMVLOG
15/03/19 17:22:34 STEVE - 1: 4115346 records imported in 143 minutes
15/03/19 17:22:34 STEVE - 1: Updated 28779 records per minute
15/03/19 17:22:34 STEVE - 1: 0 record errors, 0 records filtered out
Set-up
- Setup Samba (if Linux)
Recommend a read-only Samba share to the Madics data files
Plus read/write Samba access to /madics/miame/dsk0/280999
- Install Ashell/Madics
Extract the latest application image into the c:\Madics\maSQL folder on the Windows PC (Source M:\CD-Images\ASQL)
Create a Desktop Shortcut. C:\Madics\maSQL\miame\bin\ashw32.exe -n -i "C:\MADICS\maSQL\miame\miame.ini"
License Ashell (LICENS)
- Set-up the miame file paths. (C:\Madics\maSQL\miame\miame.ini)
In this example N:\ is a mapped drive to the Samba read/write 280999 Linux directory and P:\ is a mapped drive to the Samba read-only Madics data on the Linux system.
DEVICE=DSK0:[1,2] C:\madics\maSQL\miame\DSK0\001002\
DEVICE=DSK0:[2,2] C:\madics\maSQL\miame\DSK0\002002\
DEVICE=DSK0:[1,4] C:\madics\maSQL\miame\DSK0\001004\
DEVICE=DSK0:[7,6] C:\madics\maSQL\miame\DSK0\007006\
DEVICE=DSK0:[280,999] N:\
DEVICE=DSK0 P:\miame\dsk0\
or a full network path can be used, for example:
DEVICE=DSK0:[1,2] C:\madics\maSQL\miame\DSK0\001002\
DEVICE=DSK0:[2,2] C:\madics\maSQL\miame\DSK0\002002\
DEVICE=DSK0:[1,4] C:\madics\maSQL\miame\DSK0\001004\
DEVICE=DSK0:[7,6] C:\madics\maSQL\miame\DSK0\007006\
DEVICE=DSK0 \\10.102.100.119\madics\miame\dsk0\
DEVICE=DSK2 \\10.102.100.119\madics2\miame\dsk2\
DEVICE=DSK3 \\10.102.100.119\madics2\miame\dsk3\
DEVICE=DSK5 \\10.102.100.119\madics2\miame\dsk5\
DEVICE=DSK7 \\10.102.100.119\madics2\miame\dsk7\
- MaSQL File Definitions
Update the Madics Linux Server with the latest MaSQL File Definitions from [280,25] and configure required tables.
- Set-up ODBC
Launch the Windows ODBC Data Source Administrator and create a maSQL data source pointing to the Microsoft SQL database (a quick connection test is sketched at the end of this set-up section).
- Set-up Madics SQLUPD settings file. (SQLUPD.INI in [241,1])
[SETTINGS]
STORE SQLUPD.DAT LOCALLY=Y
FILE READ ONLY MODE=Y
DATLOG NAME=SQLUPD
DATLOG DISK=DSK0:
DATLOG PPN=[280,999]
EXIT CHAIN TO MENU=N
CLOSE ON EXIT=N
TABLES TO LOG=Y
MAX ERRORS BEFORE ABORT=10
MAX TABLE BACKUP COPIES=7
IDLE WAIT SECONDS=20
SCHEDULED START TIME=02:00
SERVER STOP HOUR=22
SERVER START HOUR=5
DEBUG MODE=N
UNIX CLIENT ONLY=Y
CATCH ERRORS TO TABLE=Y
COPY DATA FILE LOCALLY=Y
FILE DEFINITIONS PPN=DSK0:[280,25]
HOOKS ENABLED=Y
HOOK FILE=DSK0:MAS001.LOG[280,999]
MINIMUM HOOK LOGGING=Y
DVERTO_ID FIELD=N
TEXT FILE TYPE=2
SQL COMMANDS TO FILE=N
REFRESH MODE ENABLED=Y
BULK MODE ENABLED=Y
BULK BATCH MAX=120
[BACKUP TABLES]
EVERY TABLE=Y
;;SMVLOG=Y or just list selected tables if you don't want everything.
;;INTAKE=Y
;;HISSAL=Y
[Connection]
Host=maSQL
DBMSID=2
Schema=
PW=
User=
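Once the ODBC data source and the [Connection] settings above are in place, a quick way to sanity-check the connection from the Windows PC is something like the following (a hedged sketch; it assumes Python and pyodbc are installed and that the data source is named maSQL - the user and password placeholders stand in for the SQL login used in the [Connection] section):

import pyodbc

# DSN, user and password are placeholders - use the values from the [Connection] section.
conn = pyodbc.connect("DSN=maSQL;UID=<sql user>;PWD=<sql password>")
print(conn.cursor().execute("SELECT @@VERSION").fetchone()[0])   # prints the SQL Server version string
conn.close()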
MaSQL File Definitions and Configurations
ASQL has been written to stay as compatible with DVerto as possible, using very similar Madics file definitions. Each Madics data file has two definition files: a .DVT that holds information about the Madics data file and the SQL table name, and a .LAY that contains the actual Madics file structure and field names.
A simple example of this is the Madics Sales Rep File:
SALMAN.DVT
[SETTINGS]
TABLENAME=Sales_Reps
DESCR=Sales Reps
MAPFILE=SALMAN.LAY
SQLUPD DATAFILE=DSK43:SALMAN.DAT[241,XXX]
RECORDSIZE=64
[COMPANION DATA FILES]
HISSA2,128,DSK0:HISSA2.DAT[241,XXX]
1. The XXX in the data file path is automatically replaced with the company ppn it is running from.
2. The companion files are only required if a SEARCH or Parallel Record lookup is done in the .LAY file. (see further on)
SALMAN.LAY
1,Rep_Number<PK>,S,3
4,Name,S,26
If a field name contains <PK> then that field is used as the table's primary key rather than the default of the Madics record number.
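To make the .LAY format concrete, here is a small hedged sketch (purely illustrative, not ASQL's own code) that parses the Sales Rep layout above into field position, name, type and length, and picks out the <PK> marker:

def parse_lay_line(line):
    # e.g. "1,Rep_Number<PK>,S,3" -> (1, "Rep_Number", "S", 3, True)
    pos, name, ftype, length = line.split(",")[:4]
    return int(pos), name.replace("<PK>", ""), ftype, int(length), name.endswith("<PK>")

for raw in ("1,Rep_Number<PK>,S,3", "4,Name,S,26"):
    print(parse_lay_line(raw))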
Speeding up transactional data import
For transactional data files like Sales History, Stock Movement, Sales Intake etc. there is an ability to search for the first record to import, rather than relying on ASQL to read from the very first record one by one to get to the start of the required data.
For example, if there are 8 million records in your stock movement log in Madics but you only require the last 2 years (say about 2 million records) in ASQL, then instead of a full refresh starting at record one and wasting time importing nothing until it reaches the 6 millionth record, you can search straight for the required starting record; this can save multiple hours.
To do this set the following section in the .DVT file:
[SEARCH]
POS=38
LEN=6
KEY=170302
Set POS= to the field location in the Madics data file, LEN= to the field length, and KEY= to the search key, usually a date in YYMMDD format.
If no record is found then it will start from record one, so if using this make sure the date used is not a weekend etc. so that a record exists.
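A hedged sketch of the idea (not ASQL's actual code): a binary search over a fixed-length-record file that is sorted on the key at POS/LEN, returning the first record whose key is at least the search key:

def find_start_record(read_key, total_records, search_key):
    """read_key(recno) returns the YYMMDD key string stored at POS/LEN for that record.
    Assumes the file is in key order, so string comparison matches date order."""
    lo, hi = 1, total_records + 1
    while lo < hi:
        mid = (lo + hi) // 2
        if read_key(mid) < search_key:
            lo = mid + 1
        else:
            hi = mid
    return lo if lo <= total_records else 1   # fall back to record one if nothing matches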
Madics ASQL 'All Licenses are currently in use' message - see here.
The .LAY file Parameters
Standard Fields
Column 1 | Column 2 | Column 3 |
Position where the field starts in the actual Madics data file. | Data type: S=String, F=Float, B=Binary, D=Date | The length of the field in Madics. |
Example:
1,Rep_Number,S,3
Numeric Decimal Placing
Column 1 | Column 2 | Column 3 | Column 4 |
Position where the field starts in the actual Madics data file. | Data type: F=Float, B=Binary | The length of the field in Madics. | Decimal placing: QTY, COST, PRICE, TOTDIV, 100, 1000, 10000 etc. |
Example:
265,Latest_Cost,F,6,COST
175,Base_Price,F,6,PRICE
String Masking
Column 1 | Column 2 | Column 3 | Column 4 |
Position where the field starts in the actual Madics data file. | Data type: S=String | The length of the field in Madics. | MASK=String Mask |
Example:
1,Works_Order_No,S,10,MASK=######/##/##
Filtering Data
Column 1 | Column 2 | Column 3 | Column 4 |
FILTER | Table Field Name | Filter type: = <> < > =< => | Filter string. {blank} is a keyword for a blank string. |
Example:
1,Product_Group,S,15
16,Location,S,2
FILTER,Location,=,{blank}
18,Part_Number<PK>,S,15
or
FILTER,Invoice_Date,=>,04/01/2018
Madics Text Files (e.g. Stock Sales and Tech Specs)
Column 1 | Column 2 | Column 3 | Column 4 |
Position where the field starts in the actual Madics data file. | Data type: M=varchar(max) | The length of the field in Madics. | The file name to import into the table field. |
Column 4 Parameters
Instead of entering the actual full text file name, you can replace certain parts of it with data from the Madics files, for example the Stock Tech Spec number.
DSK0:XXXXXX.TSP[241|PPP]
XXXXXX is replaced with the data from Madics using Column 1 and Column 3. (For example the Spec Number)
PPP is replaced with the company ppn, for example PPP becomes 001 (Live company), 016 (Training) etc.
Example:
326,Spec_No,S,6
FILTER,Spec_No,<>,{blank}
326,TSP_Spec,M,6,DSK0:XXXXXX.TSP[241|PPP]
326,SSP_Spec,M,6,DSK0:XXXXXX.SSP[241|PPP]
326,ISP_Spec,M,6,DSK0:XXXXXX.ISP[241|PPP]
326,PSP_Spec,M,6,DSK0:XXXXXX.PSP[241|PPP]
Fixed Static Field
Column 1 | Column 2 | Column 3 | Column 4 | Column 5 | Column 6 | Column 7 |
0 (Zero) | Table Field Name | Data type: S=String, F=Float, B=Binary, D=Date | Field length | Blank column | FIELDONLY | String/Data |
Example:
0,TestDate,D,6,,FIELDONLY,2018-08-24
Running numeric fields
Column 1 | Column 2 | Column 3 | Column 4 |
0 (Zero) | Table Field Name | A=Auto increment numeric | Last Number/Starting Number (e.g. 100 means it starts from 101) |
Example:
0,RunningNumber,A,100
Parallel Record Lookup
Column 1 | Column 2 | Column 3 | Column 4 | Column 5 | Column 6 | Column 7 | Column 8 | Column 9 | Column 10 | Column 11 |
0 (Zero) | Table Field Name | Data type | Field length | Blank column | PARALLEL | Madics Data File | Field start position | data type | length | Decimal Places QTY, COST, PRICE etc |
Example:
0,Customer_Unique_ID<PK>,F,6,,PARALLEL,SLCUS2,595,F,6
SLCUS2 would also need to be defined as a companion file in SLCUST.DVT.
Search Record Lookup
Column 1 | Column 2 | Column 3 | Column 4 | Column 5 | Column 6 | Column 7 | Column 8 | Column 9 | Column 10 | Column 11 | Column 12 | Column 13 | Column 14 |
0 (Zero) | Table Field Name | Data type | Field length | Blank column | SEARCH or SEQUENTIALSEARCH | Madics Data File | Field start position | data type | length | Key Field | Key Position | Key Length | Decimal Places QTY, COST, PRICE etc |
Example:
441,External_Sales_Rep,S,3
0,Rep_Name,S,26,,SEARCH,SALMAN,4,S,26,External_Sales_Rep,1,3
SALMAN would also need to be defined as a companion file in SLCUST.DVT.
SEARCH - This does a binary search on a sorted Madics file.
SEQUENTIALSEARCH - This does a slower sequential search on a non-indexed field.
Special Types: - STERLING
A STERLING type allows a computed field to be calculated from a value and an exchange rate. So, for example, to calculate the Sales History sterling amount:
80,Net_Line_Value,F,6,TOTDIV
120,Currency_Code,S,3
123,Exchange_Rate,F,6
0,Sterling_Line_Value,F,6,,STERLING,Net_Line_Value,Exchange_Rate,Currency_Code
Note: If 0 (zero) is passed instead of an Exchange_Rate field then ASQL will attempt to use the latest exchange rate on the currency table.
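A hedged sketch of the computation (illustrative only - whether the foreign value is divided or multiplied by the rate depends on how exchange rates are stored in the Madics currency table; foreign-units-per-pound rates are assumed here, and latest_rate_for is a hypothetical lookup standing in for the currency table):

def sterling_value(net_line_value, exchange_rate, currency_code, latest_rate_for):
    # If the exchange rate field passed is 0, fall back to the latest rate on the
    # currency table, as per the note above.
    rate = exchange_rate or latest_rate_for(currency_code)
    # Assumption: rates are stored as foreign units per pound, so divide;
    # if Madics stores the inverse, this would be a multiply instead.
    return round(net_line_value / rate, 2)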
Special Types: - ORDCRD
The ORDCRD command turns credit note type values negative.
This special command uses column 6 (command) and column 7 (field name).
For example:
50,Qty_Invoiced,F,6,QTY,ORDCRD,Order_Or_Credit
56,Cost,F,6,COST,ORDCRD,Order_Or_Credit
It also works for parallel record reads, but using columns 12 & 13.
0,His_Latest_Cost,F,6,,PARALLEL,HISSA2,7,F,6,COST,ORDCRD,Order_Or_Credit
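A hedged one-line sketch of the idea (the actual flag values used in Madics are an assumption here; 'C' is used for a credit purely for illustration):

def apply_ordcrd(value, order_or_credit):
    # Negate quantities and values on credit notes so they subtract in totals.
    return -value if order_or_credit == "C" else value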
Special Types: - ORDAUD_FIELD_DESCR
When reading ORDAUD (Sales order audit) this special computed type will set the field description.
This special command uses column 6 (COMPUTE), column 7 (ORDAUD) and column 8 (ORDAUD_FIELD_DESCR).
For example:
0,Audit_Field_Description,S,40,,COMPUTE,ORDAUD,ORDAUD_FIELD_DESCR
56,Cost,F,6,COST,ORDCRD,Order_Or_Credit
Computed Fields
For Sales Order lines (ORDLIN) set a currency code.
To set a currency code on a sales order line, use the following COMPUTE tag; this will look up the sales order header and set the currency code on the line. For example:
0,Currency_Code,S,3,,COMPUTE,ORDHD2-CURRENCY
Get the latest currency exchange rate
To set a field to the latest exchange rate taken from the Madics currency table, use the COMPUTE tag EXCHANGE-RATE and pass the currency code, for example:
0,Exchange_Rate,F,6,,COMPUTE,EXCHANGE-RATE,Currency_Code
Special Madics File/Table Types - Selected record numbers only
Set SPECIAL=RECORDS,{from record},{to record} in the .DVT file.
For example, to create a table with the system hold descriptions:
[SETTINGS]
TABLENAME=Settings5
DESCR=Setting Record 5
MAPFILE=OESYS5.LAY
SQLUPD DATAFILE=DSK0:OESYS.DAT[241,XXX]
RECORDSIZE=256
SPECIAL=RECORDS,5,5
With the OESYS5.LAY containing:
1,HoldDescription1,S,14
15,HoldDescription2,S,14
29,HoldDescription3,S,14
43,HoldDescription4,S,14
57,HoldDescription5,S,14
71,HoldDescription6,S,14
85,HoldDescription7,S,14
99,HoldDescription8,S,14
Special Madics File/Table Types - for Sales Order Log
Sales Order Log - used to create a separate SQL despatch record for each despatch, rather than the 8-to-a-record layout of the Madics file.
Set SPECIAL=ORDLOG in the ORDLOG.DVT
For example:
[SETTINGS]
TABLENAME=Sales_Order_log_Live
DESCR=Sales Order Lines Log - Live
MAPFILE=ORDLOG.LAY
SQLUPD DATAFILE=DSK0:ORDLOG.DAT[241,XXX]
RECORDSIZE=512
HEADER RECORD=DSK0:ORDLIN.DAT[241,XXX]
SPECIAL=ORDLOG
[COMPANION DATA FILES]
ORDLIN,512,DSK0:ORDLIN.DAT[241,XXX]
With a ORDLOG.LAY file containing:
0,OrdCredKey,S,7,,JOIN,Order_Number,Order_Or_Credit
1,Order_Number,S,6
7,Order_Or_Credit,S,1
0,Despatch_Qty,F,6
0,Despatch_Note,S,6
0,Despatch_Date,D,6
0,Despatch_Flag,S,1
0,Despatch_Ref,S,10
0,Invoice_Number,S,6
0,Invoice_Date,D,6
0,Des_RecNo_Key<PK>,S,20
0,Unique_Line_Seq_No,B,3,,PARALLEL,ORDLIN,150,B,3
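A hedged sketch of what SPECIAL=ORDLOG does conceptually (the 8-slot structure is described above, but the slot contents and key format shown here are hypothetical; the real record positions are handled inside ASQL):

def explode_ordlog_record(order_number, order_or_credit, despatch_slots):
    """despatch_slots: up to 8 (qty, note, date, flag, ref, inv_no, inv_date) tuples
    held in one Madics ORDLOG record. Emit one SQL row per used slot."""
    for seq, slot in enumerate(despatch_slots, start=1):
        qty, note, date, flag, ref, inv_no, inv_date = slot
        if not note and not qty:
            continue                                   # skip unused slots
        yield {
            "OrdCredKey": order_number + order_or_credit,
            "Order_Number": order_number,
            "Order_Or_Credit": order_or_credit,
            "Despatch_Qty": qty,
            "Despatch_Note": note,
            "Despatch_Date": date,
            "Despatch_Flag": flag,
            "Despatch_Ref": ref,
            "Invoice_Number": inv_no,
            "Invoice_Date": inv_date,
            "Des_RecNo_Key": f"{order_number}-{seq}",  # hypothetical unique key format
        }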
Complex Sales History definition example with a mixture of types and commands.
8,Product_Group,S,15
23,Part_Number,S,15
38,Invoice_Date,D,6
FILTER,Invoice_Date,>,31/05/2018
44,Customer_Code,S,6
120,Currency_Code,S,3
123,Exchange_Rate,F,6
50,Qty_Invoiced,F,6,QTY,ORDCRD,Order_Or_Credit
56,Cost,F,6,COST,ORDCRD,Order_Or_Credit
62,Price,F,6,TOTDIV,ORDCRD,Order_Or_Credit
68,Discount,F,6,TOTDIV,ORDCRD,Order_Or_Credit
74,Internal_Rep,S,6
80,Net_Line_Value,F,6,TOTDIV,ORDCRD,Order_Or_Credit
0,Sterling_Line_Value,F,6,,STERLING,Net_Line_Value,Exchange_Rate,Currency_Code
86,VAT,F,6,TOTDIV,ORDCRD,Order_Or_Credit
92,Branch,S,10
104,External_Rep,S,3
107,Area,S,3
110,Stock_location,S,2
112,Order_Number,S,6
118,Product_Cat,S,2
0,His_Latest_Cost,F,6,,PARALLEL,HISSA2,7,F,6,COST,ORDCRD,Order_Or_Credit
0,His_Source,S,1,,PARALLEL,HISSA2,44,B,1
0,His_Unique_ID<PK>,F,6,,PARALLEL,HISSA2,54,F,6
Complex Sales Order Line definition example with a mixture of types and commands.
1,Order_Number,S,6
7,Product_Group,S,15
22,Location,S,2
24,Part_Number,S,15
FILTER,Part_Number,<>,*
FILTER,Part_Number,<>,**
39,Description,S,40
89,Qty_Ordered,F,6,QTY
95,Qty_Allocated,F,6,QTY
101,Qty_Invoiced,F,6,QTY
107,Net_Price,F,6,PRICE
113,Gross_Price,F,6,PRICE
131,Discount,F,6,PRICE
137,Line_Status,S,1
138,BIN_Location,S,6
150,Unique_Line_Seq_No,B,3
153,Line_Number,B,2
202,Required_Date,D,6
208,Acknowledged_Date,D,6
0,Currency_Code,S,3,,COMPUTE,ORDHD2-CURRENCY
0,Exchange_Rate,F,6,,COMPUTE,EXCHANGE-RATE,Currency_Code
0,Sterling_Net_Price,F,6,,STERLING,Net_Price,Exchange_Rate,Currency_Code
MadicsSQL and importing CSV files.
If enabled, MadicsSQL will scan a predefined directory for .CSI files (CSV files with a different extension name). If one is found, and a definition file with a matching file name and an extension of .CSD exists in [280,25], then MadicsSQL will use this to import the contents of the CSV into a SQL table. The .CSI file is then renamed to .OK (or to .OPS if the import failed).
The following settings need to be enabled in SQLUPD.INI
CSV TO SQL IMPORTS=Y
CSV DIRECTORY=N:\Madics\miame\dsk0\280999 (Samba path to LINUX 280999 directory)
CSV SUFFIX=CSI
There needs to be a .CSD definition file in [280,25] that contains what table to create and the field layouts, for example:
[SETTINGS]
TABLE NAME=SOP029_Stock_Valuation
DESCRIPTION=SOP29 CSV Export
IGNORE FIRST LINE=Y
BACKUP TABLE=Y
EMAIL NOTIFICATION=Y
EMAIL RECEIPT=
APPEND MODE=N
;PRIMARY KEY=
[FIELDS]
1=Product_Group,varchar(15)
2=Part_Number,varchar(15)
3=Description,varchar(40)
4=Category,varchar(2)
5=Location,varchar(2)
6=Latest_Cost,FLOAT
7=Free_JR,FLOAT
8=Allocated,FLOAT
9=Physical,FLOAT
10=Total,FLOAT
11=ReportDate,DATE
12=ReportUser,varchar(6)
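A hedged sketch of the scan-and-import loop described above (illustrative only - MadicsSQL itself is written in A-Shell; table creation from the [FIELDS] section, the email notification and the other .CSD settings are omitted, and the SOP029_Stock_Valuation table is assumed to already exist):

import csv, os
import pyodbc

def import_csi_files(csv_dir, table="SOP029_Stock_Valuation", suffix=".CSI"):
    conn = pyodbc.connect("DSN=maSQL")                   # ODBC data source from the set-up section
    cur = conn.cursor()
    for name in os.listdir(csv_dir):
        if not name.upper().endswith(suffix):
            continue
        path = os.path.join(csv_dir, name)
        try:
            with open(path, newline="") as fh:
                rows = list(csv.reader(fh))[1:]          # IGNORE FIRST LINE=Y
            for row in rows:
                placeholders = ",".join("?" * len(row))
                cur.execute(f"INSERT INTO {table} VALUES ({placeholders})", row)
            conn.commit()
            os.rename(path, path[:-len(suffix)] + ".OK")   # successful import
        except Exception:
            conn.rollback()
            os.rename(path, path[:-len(suffix)] + ".OPS")  # failed import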