How to read CSV file and insert data into PostgreSQL using Mule ESB, Mule Studio

I am very new to Mule Studio.

I am facing a problem: I have a requirement to insert data from a CSV file into a PostgreSQL database using Mule Studio.

I am using Mule Studio CE (version 1.3.1). I searched on Google and found that the DataMapper can be used for this, but it is available only in EE, so I cannot use it.

I also found an article online: Using Mule Studio to read Data from PostgreSQL(Inbound) and write it to File (Outbound) - Step by Step approach.

That seems feasible, but my requirement is just the opposite of that article: I need File as the inbound component and the Database as the outbound component.

What is the way to do so?

Any step by step help (like what components to use) and guidance will be greatly appreciated.

Vardon answered 17/12, 2012 at 9:19 Comment(0)

Here is an example that inserts a two-column CSV file:

<configuration>
    <expression-language autoResolveVariables="true">
        <import class="org.mule.util.StringUtils" />
        <import class="org.mule.util.ArrayUtils" />
    </expression-language>
</configuration>

<spring:beans>
    <spring:bean id="jdbcDataSource" class=" ... your data source ... " />
</spring:beans>

<jdbc:connector name="jdbcConnector" dataSource-ref="jdbcDataSource">
    <jdbc:query key="insertRow"
        value="insert into my_table(col1, col2) values(#[message.payload[0]],#[message.payload[1]])" />
</jdbc:connector>

<flow name="csvFileToDatabase">
    <file:inbound-endpoint path="/tmp/mule/inbox"
        pollingFrequency="5000" moveToDirectory="/tmp/mule/processed">
        <file:filename-wildcard-filter pattern="*.csv" />
    </file:inbound-endpoint>

    <!-- Loads the whole file into RAM - won't work for big files! -->
    <file:file-to-string-transformer />
    <!-- Split each row, dropping the first one (header) -->
    <splitter
        expression="#[rows=StringUtils.split(message.payload, '\n\r');ArrayUtils.subarray(rows,1,rows.size())]" />
    <!-- Transform CSV row in array -->
    <expression-transformer expression="#[StringUtils.split(message.payload, ',')]" />
    <jdbc:outbound-endpoint queryKey="insertRow" />
</flow>
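
Since the data source bean above is deliberately left blank, here is a minimal sketch of what it could look like for PostgreSQL, assuming the stock org.postgresql.ds.PGSimpleDataSource class and placeholder connection details:

<spring:beans>
    <!-- Hypothetical PostgreSQL data source; server, database, user and
         password are placeholders for your environment -->
    <spring:bean id="jdbcDataSource"
        class="org.postgresql.ds.PGSimpleDataSource">
        <spring:property name="serverName" value="localhost" />
        <spring:property name="portNumber" value="5432" />
        <spring:property name="databaseName" value="mydb" />
        <spring:property name="user" value="postgres" />
        <spring:property name="password" value="secret" />
    </spring:bean>
</spring:beans>

Any other DataSource implementation (for example one from a connection pool) works just as well, as long as the bean id matches the dataSource-ref on the connector.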
Zambia answered 18/12, 2012 at 0:8 Comment(3)
Hi, this worked. I just wanted to ask two more questions: (a) How can I validate that a valid .csv file is being parsed (extension checking alone is fine) and then proceed with the next steps? (b) How do I filter out header rows? – Vardon
Revised my answer to add a filename-wildcard-filter so that only *.csv files are picked up, to drop the first row of the split array (the header), and to use file-to-string-transformer (more correct than object-to-string-transformer). – Zambia
Can someone give directions on how to validate the CSV? I mean, to check that none of the data is null, and so on. – Influential
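
Regarding the validation question in the last comment, one lightweight option is to reject bad rows before the insert. A minimal sketch, assuming the two-element array payload produced by the expression-transformer in the flow above; the filter would sit between the expression-transformer and the jdbc:outbound-endpoint, and silently drops rows that do not match:

<!-- Keep only rows with exactly two non-blank fields -->
<expression-filter
    expression="#[message.payload.size() == 2 &amp;&amp; !StringUtils.isBlank(message.payload[0]) &amp;&amp; !StringUtils.isBlank(message.payload[1])]" />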

In order to read a CSV file and insert its data into PostgreSQL using Mule, follow the steps below. You need the following as prerequisites:

  • PostgreSQL
  • PostgreSQL JDBC driver
  • Anypoint Studio IDE and
  • A database created in PostgreSQL

Then configure the PostgreSQL JDBC driver in Global Element Properties inside Studio, and create the Mule flow in Anypoint Studio as follows (a configuration sketch follows the steps):

  • Step 1: Wrap the CSV file source in a File component
  • Step 2: Convert the file payload from an object to a string
  • Step 3: Split the payload into rows
  • Step 4: Transform each CSV row into an array
  • Step 5: Insert the rows into the destination database
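
A configuration sketch of those five steps, assuming the Mule 3.x Database connector that ships with Anypoint Studio, a local PostgreSQL instance, and placeholder paths, table and column names (header-row handling omitted for brevity; see the splitter trick in the accepted answer):

<!-- Hypothetical generic database config pointing at a local PostgreSQL -->
<db:generic-config name="PostgreSQL_Configuration"
    url="jdbc:postgresql://localhost:5432/mydb?user=postgres&amp;password=secret"
    driverClassName="org.postgresql.Driver" />

<flow name="csvFileToPostgres">
    <!-- Step 1: poll a directory for CSV files -->
    <file:inbound-endpoint path="/tmp/mule/inbox"
        pollingFrequency="5000" moveToDirectory="/tmp/mule/processed" />
    <!-- Step 2: read the file content as a string -->
    <file:file-to-string-transformer />
    <!-- Step 3: split the payload into one message per row -->
    <splitter expression="#[message.payload.split('\n')]" />
    <!-- Step 4: turn each CSV row into an array of values -->
    <expression-transformer expression="#[message.payload.split(',')]" />
    <!-- Step 5: insert the row into the destination table -->
    <db:insert config-ref="PostgreSQL_Configuration">
        <db:parameterized-query><![CDATA[
            insert into my_table(col1, col2) values (#[payload[0]], #[payload[1]])
        ]]></db:parameterized-query>
    </db:insert>
</flow>

The MEL expressions inside db:parameterized-query become JDBC parameters rather than being concatenated into the SQL string.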
Breann answered 30/1, 2018 at 19:51 Comment(0)

I would like to suggest DataWeave.

Steps

  1. Read the file using the FTP connector/endpoint.

  2. Transform it using DataWeave (see the sketch after these steps).

  3. Use the Database connector to store the data in the DB.
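
For illustration, a sketch of such a flow, assuming Mule 3.7+ (where DataWeave is the Transform Message component, EE only), placeholder FTP credentials, hypothetical col1/col2 field names, and a database configuration named PostgreSQL_Configuration defined elsewhere:

<flow name="csvOverFtpToDatabase">
    <!-- 1. Poll the FTP server for incoming files (placeholder host/credentials) -->
    <ftp:inbound-endpoint host="ftp.example.com" port="21"
        path="/inbox" user="ftpuser" password="secret" />
    <!-- 2. Parse the CSV and map each record with DataWeave -->
    <dw:transform-message>
        <dw:input-payload mimeType="application/csv" />
        <dw:set-payload><![CDATA[%dw 1.0
%output application/java
---
payload map {
    col1: $.col1,
    col2: $.col2
}]]></dw:set-payload>
    </dw:transform-message>
    <!-- 3. Insert each mapped row through the Database connector -->
    <foreach>
        <db:insert config-ref="PostgreSQL_Configuration">
            <db:parameterized-query><![CDATA[
                insert into my_table(col1, col2) values (#[payload.col1], #[payload.col2])
            ]]></db:parameterized-query>
        </db:insert>
    </foreach>
</flow>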

Populous answered 7/6, 2016 at 4:51 Comment(1)
No. The OP explicitly wants a CE-only solution. – Zambia
