Connecting to Cloud SQL from Dataflow Job

I'm struggling to use JdbcIO with Apache Beam 2.0 (Java) to connect to a Cloud SQL instance from Dataflow within the same project.

I'm getting the following error:

java.sql.SQLException: Cannot create PoolableConnectionFactory (Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.)
  • According to the documentation, the Dataflow service account *@dataflow-service-producer-prod.iam.gserviceaccount.com should have access to all resources within the same project as long as it has the "Editor" role.

  • When I run the same job with the DirectRunner, everything works fine.

This is the code I'm using:

private static String JDBC_URL = "jdbc:mysql://myip:3306/mydb?verifyServerCertificate=false&useSSL=true";

PCollection<KV<String, Double>> exchangeRates = p.apply(
    JdbcIO.<KV<String, Double>>read()
        .withDataSourceConfiguration(
            JdbcIO.DataSourceConfiguration.create("com.mysql.jdbc.Driver", JDBC_URL)
                .withUsername(JDBC_USER)
                .withPassword(JDBC_PW))
        .withQuery("SELECT CurrencyCode, ExchangeRate FROM mydb.mytable")
        .withCoder(KvCoder.of(StringUtf8Coder.of(), DoubleCoder.of()))
        .withRowMapper(new JdbcIO.RowMapper<KV<String, Double>>() {
            public KV<String, Double> mapRow(ResultSet resultSet) throws Exception {
                return KV.of(resultSet.getString(1), resultSet.getDouble(2));
            }
        }));

EDIT:

Opening a plain JDBC connection with the following approach (outside of Beam) in another Dataflow job works fine with the DataflowRunner, which tells me that the database itself is probably not the problem:

java.sql.Connection connection = DriverManager.getConnection(JDBC_URL, JDBC_USER, JDBC_PW);
Gillie answered 22/6, 2017 at 12:38

Following these instructions on how to connect to Cloud SQL from Java:

https://cloud.google.com/sql/docs/mysql/connect-external-app#java

I managed to make it work.

This is what the code looks like (you must replace MYDBNAME, MYSQLINSTANCE, USER and PASSWORD with your own values).

Heads up: the MYSQLINSTANCE format is project:region:instancename.
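
Note that this connection string goes through the Cloud SQL socket factory, so the socket-factory artifact must be on your classpath in addition to the MySQL driver. A minimal sketch of the Maven dependency, per the linked docs (the version number here is an assumption; check the docs for the current release):

<!-- Cloud SQL socket factory for mysql-connector-java 5.x; version is an assumption -->
<dependency>
  <groupId>com.google.cloud.sql</groupId>
  <artifactId>mysql-socket-factory</artifactId>
  <version>1.0.5</version>
</dependency>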

And I'm using a custom class (Customer) to store the values for each row, instead of key-value pairs (see the sketch of the class after the pipeline code).

p.apply(JdbcIO.<Customer>read()
    .withDataSourceConfiguration(
        JdbcIO.DataSourceConfiguration.create(
            "com.mysql.jdbc.Driver",
            "jdbc:mysql://google/MYDBNAME?cloudSqlInstance=MYSQLINSTANCE&socketFactory=com.google.cloud.sql.mysql.SocketFactory&user=USER&password=PASSWORD&useUnicode=true&characterEncoding=UTF-8"))
    .withQuery("SELECT CustomerId, Name, Location, Email FROM Customers")
    .withCoder(AvroCoder.of(Customer.class))
    .withRowMapper(new JdbcIO.RowMapper<Customer>() {
        @Override
        public Customer mapRow(java.sql.ResultSet resultSet) throws Exception {
            final Logger LOG = LoggerFactory.getLogger(CloudSqlToBq.class);
            LOG.info(resultSet.getString(2));
            // Columns 1-4 match the SELECT order: CustomerId, Name, Location, Email.
            return new Customer(resultSet.getInt(1), resultSet.getString(2),
                resultSet.getString(3), resultSet.getString(4));
        }
    }));
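
For reference, a minimal sketch of what the Customer class might look like; the field names and types here are assumptions derived from the query above. AvroCoder uses Avro reflection, so the class needs a no-arg constructor:

import org.apache.avro.reflect.Nullable;

public class Customer {
    int customerId;
    @Nullable String name;
    @Nullable String location;
    @Nullable String email;

    public Customer() {} // no-arg constructor required by AvroCoder's Avro reflection

    public Customer(int customerId, String name, String location, String email) {
        this.customerId = customerId;
        this.name = name;
        this.location = location;
        this.email = email;
    }
}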

I hope this helps.

Cradle answered 1/5, 2018 at 10:56

Just curious, what would the Customer class look like? Would it be something like a JavaBean class? – Chauffer

Hi, it worked for me the way you did it. Additionally, I removed the withUsername and withPassword methods from the data source configuration and passed the credentials in the connection URL instead; my pipeline configuration looks like this:

PCollection<KV<Double, Double>> exchangeRates = p.apply(
    JdbcIO.<KV<Double, Double>>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
            "com.mysql.jdbc.Driver",
            "jdbc:mysql://ip:3306/dbname?user=root&password=root&useUnicode=true&characterEncoding=UTF-8"))
        .withQuery("SELECT PERIOD_YEAR, PERIOD_YEAR FROM SALE")
        .withCoder(KvCoder.of(DoubleCoder.of(), DoubleCoder.of()))
        .withRowMapper(new JdbcIO.RowMapper<KV<Double, Double>>() {
            @Override
            public KV<Double, Double> mapRow(java.sql.ResultSet resultSet) throws Exception {
                LOG.info(resultSet.getDouble(1) + " Came");
                return KV.of(resultSet.getDouble(1), resultSet.getDouble(2));
            }
        }));

Hope this helps.

Rigamarole answered 2/8, 2017 at 11:13

I think this approach may work better: try the com.mysql.jdbc.GoogleDriver and use the Maven dependencies listed here:

https://cloud.google.com/appengine/docs/standard/java/cloud-sql/#Java_Connect_to_your_database

Related question: Where do I find and download the com.mysql.jdbc.GoogleDriver jar file?

Delamare answered 27/6, 2017 at 23:29

Hey @alex-amato, unfortunately that doesn't seem to work with GCP Dataflow, as I get a "java.sql.SQLException: Cannot load JDBC driver class 'com.mysql.jdbc.GoogleDriver'" error even though both Maven dependencies are added. – Gillie
