How to java-configure separate datasources for Spring Batch data and business data? Should I even do it?

My main job does read-only operations, and the other one does a bit of writing, but against the MyISAM engine, which ignores transactions, so I wouldn't necessarily require transaction support... How can I configure Spring Batch to have its own datasource for the JobRepository, separate from the one holding the business data? The initial datasource configuration is done as follows:

@Configuration
public class StandaloneInfrastructureConfiguration {

    @Autowired
    Environment env;

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(dataSource());
        em.setPackagesToScan(new String[] { "org.podcastpedia.batch.*" });

        JpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        em.setJpaVendorAdapter(vendorAdapter);
        em.setJpaProperties(additionalJpaProperties());

        return em;
    }

    Properties additionalJpaProperties() {
        Properties properties = new Properties();
        properties.setProperty("hibernate.hbm2ddl.auto", "none");
        properties.setProperty("hibernate.dialect", "org.hibernate.dialect.MySQL5Dialect");
        properties.setProperty("hibernate.show_sql", "true");
        return properties;
    }

    @Bean
    public DataSource dataSource() {
        return DataSourceBuilder.create()
                .url(env.getProperty("db.url"))
                .driverClassName(env.getProperty("db.driver"))
                .username(env.getProperty("db.username"))
                .password(env.getProperty("db.password"))
                .build();
    }

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(emf);
        return transactionManager;
    }
}

and it is then imported into the Job's configuration class, where the @EnableBatchProcessing annotation automagically makes use of it. My first thought was to try making the configuration class extend DefaultBatchConfigurer, but then I get a BeanCurrentlyInCreationException (org.springframework.beans.factory.BeanCurrentlyInCreationException: Error creating bean with name jobBuilders: Requested bean is currently in creation: Is there an unresolvable circular reference?):

@Configuration
@EnableBatchProcessing
@Import({ StandaloneInfrastructureConfiguration.class, NotifySubscribersServicesConfiguration.class })
public class NotifySubscribersJobConfiguration extends DefaultBatchConfigurer {

    @Autowired
    private JobBuilderFactory jobBuilders;

    @Autowired
    private StepBuilderFactory stepBuilders;

    @Autowired
    private DataSource dataSource;

    @Autowired
    Environment env;

    @Override
    @Autowired
    public void setDataSource(javax.sql.DataSource dataSource) {
        super.setDataSource(batchDataSource());
    }

    private DataSource batchDataSource() {
        return DataSourceBuilder.create()
                .url(env.getProperty("batchdb.url"))
                .driverClassName(env.getProperty("batchdb.driver"))
                .username(env.getProperty("batchdb.username"))
                .password(env.getProperty("batchdb.password"))
                .build();
    }

    @Bean
    public ItemReader notifySubscribersReader() {
        JdbcCursorItemReader reader = new JdbcCursorItemReader();
        String sql = "select * from users where is_email_subscriber is not null";
        reader.setSql(sql);
        reader.setDataSource(dataSource);
        reader.setRowMapper(rowMapper());
        return reader;
    }

    ........
}

Any thoughts are more than welcome. The project is available on GitHub – https://github.com/podcastpedia/podcastpedia-batch

Thanks a lot.

Okay, this is strange but it works. Moving the datasources into their own configuration class works fine, and one is able to autowire.

The example is a multi-datasource version of the Spring Batch Service Example:

DataSourceConfiguration:

public class DataSourceConfiguration {

    @Value("classpath:schema-mysql.sql")
    private Resource schemaScript;

    @Bean
    @Primary
    public DataSource hsqldbDataSource() throws SQLException {
        final SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
        dataSource.setDriver(new org.hsqldb.jdbcDriver());
        dataSource.setUrl("jdbc:hsqldb:mem:mydb");
        dataSource.setUsername("sa");
        dataSource.setPassword("");
        return dataSource;
    }

    @Bean
    public JdbcTemplate jdbcTemplate(final DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    public DataSource mysqlDataSource() throws SQLException {
        final SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
        dataSource.setDriver(new com.mysql.jdbc.Driver());
        dataSource.setUrl("jdbc:mysql://localhost/spring_batch_example");
        dataSource.setUsername("test");
        dataSource.setPassword("test");
        DatabasePopulatorUtils.execute(databasePopulator(), dataSource);
        return dataSource;
    }

    @Bean
    public JdbcTemplate mysqlJdbcTemplate(@Qualifier("mysqlDataSource") final DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    private DatabasePopulator databasePopulator() {
        final ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
        populator.addScript(schemaScript);
        return populator;
    }
}

BatchConfiguration:

@Configuration
@EnableBatchProcessing
@Import({ DataSourceConfiguration.class, MBeanExporterConfig.class })
public class BatchConfiguration {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public ItemReader<Person> reader() {
        final FlatFileItemReader<Person> reader = new FlatFileItemReader<Person>();
        reader.setResource(new ClassPathResource("sample-data.csv"));
        reader.setLineMapper(new DefaultLineMapper<Person>() {
            {
                setLineTokenizer(new DelimitedLineTokenizer() {
                    {
                        setNames(new String[] { "firstName", "lastName" });
                    }
                });
                setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {
                    {
                        setTargetType(Person.class);
                    }
                });
            }
        });
        return reader;
    }

    @Bean
    public ItemProcessor<Person, Person> processor() {
        return new PersonItemProcessor();
    }

    @Bean
    public ItemWriter<Person> writer(@Qualifier("mysqlDataSource") final DataSource dataSource) {
        final JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
        writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
        writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
        writer.setDataSource(dataSource);
        return writer;
    }

    @Bean
    public Job importUserJob(final Step s1) {
        return jobs.get("importUserJob").incrementer(new RunIdIncrementer()).flow(s1).end().build();
    }

    @Bean
    public Step step1(final ItemReader<Person> reader, final ItemWriter<Person> writer,
            final ItemProcessor<Person, Person> processor) {
        return steps.get("step1")
                .<Person, Person>chunk(1)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}

I have my datasources in a separate configuration class. In the batch configuration, I extend DefaultBatchConfigurer and override the setDataSource method, passing in the specific database to use with Spring Batch via a @Qualifier. I was unable to get it working using the constructor version, but the setter method worked for me.
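
In other words, a minimal sketch of that override (the class name is illustrative, and it assumes the @Primary hsqldbDataSource above is the one Spring Batch should use for its metadata):

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class BatchConfigurerConfiguration extends DefaultBatchConfigurer {

    // Setter injection with a @Qualifier; the constructor variant did not work for me.
    @Override
    @Autowired
    public void setDataSource(@Qualifier("hsqldbDataSource") DataSource dataSource) {
        super.setDataSource(dataSource);
    }
}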

My Reader, Processor and Writer are in their own standalone classes, along with the steps.

This is using Spring Boot 1.1.8 and Spring Batch 3.0.1. Note: we had a different setup for a project using Spring Boot 1.1.5, which did not work the same way in the newer version.

package org.sample.config.jdbc;

import javax.sql.DataSource;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.core.env.Environment;

import com.atomikos.jdbc.AtomikosDataSourceBean;
import com.mysql.jdbc.jdbc2.optional.MysqlXADataSource;

/**
 * The Class DataSourceConfig.
 */
@Configuration
public class DataSourceConfig {

    private final static Logger log = LoggerFactory.getLogger(DataSourceConfig.class);

    @Autowired
    private Environment env;

    /**
     * Main data source.
     *
     * @return the data source
     */
    @Bean(name = "mainDataSource")
    @Primary
    public DataSource mainDataSource() {
        final String user = this.env.getProperty("db.main.username");
        final String password = this.env.getProperty("db.main.password");
        final String url = this.env.getProperty("db.main.url");
        return this.getMysqlXADataSource(url, user, password);
    }

    /**
     * Batch data source.
     *
     * @return the data source
     */
    @Bean(name = "batchDataSource", initMethod = "init", destroyMethod = "close")
    public DataSource batchDataSource() {
        final String user = this.env.getProperty("db.batch.username");
        final String password = this.env.getProperty("db.batch.password");
        final String url = this.env.getProperty("db.batch.url");
        return this.getAtomikosDataSource("metaDataSource", this.getMysqlXADataSource(url, user, password));
    }

    /**
     * Gets the mysql xa data source.
     *
     * @param url the url
     * @param user the user
     * @param password the password
     * @return the mysql xa data source
     */
    private MysqlXADataSource getMysqlXADataSource(final String url, final String user, final String password) {
        final MysqlXADataSource mysql = new MysqlXADataSource();
        mysql.setUser(user);
        mysql.setPassword(password);
        mysql.setUrl(url);
        mysql.setPinGlobalTxToPhysicalConnection(true);
        return mysql;
    }

    /**
     * Gets the atomikos data source.
     *
     * @param resourceName the resource name
     * @param xaDataSource the xa data source
     * @return the atomikos data source
     */
    private AtomikosDataSourceBean getAtomikosDataSource(final String resourceName, final MysqlXADataSource xaDataSource) {
        final AtomikosDataSourceBean atomikos = new AtomikosDataSourceBean();
        atomikos.setUniqueResourceName(resourceName);
        atomikos.setXaDataSource(xaDataSource);
        atomikos.setMaxLifetime(3600);
        atomikos.setMinPoolSize(2);
        atomikos.setMaxPoolSize(10);
        return atomikos;
    }
}

package org.sample.settlement.batch;

import javax.sql.DataSource;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

/**
 * The Class BatchConfiguration.
 */
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    private final static Logger log = LoggerFactory.getLogger(BatchConfiguration.class);

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Autowired
    @Qualifier("processStep")
    private Step processStep;

    /**
     * Process payments job.
     *
     * @return the job
     */
    @Bean(name = "processJob")
    public Job processJob() {
        return this.jobs.get("processJob")
                .incrementer(new RunIdIncrementer())
                .start(processStep)
                .build();
    }

    @Override
    @Autowired
    public void setDataSource(@Qualifier("batchDataSource") DataSource batchDataSource) {
        super.setDataSource(batchDataSource);
    }
}

Have you already tried something like this?

 @Bean(name="batchDataSource") public DataSource batchDataSource(){ return DataSourceBuilder.create() .url(env.getProperty("batchdb.url")) .driverClassName(env.getProperty("batchdb.driver")) .username(env.getProperty("batchdb.username")) .password(env.getProperty("batchdb.password")) .build(); } 

then mark the other datasource with @Primary, and use a @Qualifier in your batch configuration to specify that you want the batchDataSource bean to be autowired.
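
A sketch of what that combination might look like, reusing the property names from the question (the enclosing class is illustrative):

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
// Spring Boot 1.x package; in Boot 2.x this is org.springframework.boot.jdbc.DataSourceBuilder
import org.springframework.boot.autoconfigure.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.core.env.Environment;

@Configuration
public class TwoDataSourcesConfiguration {

    @Autowired
    Environment env;

    // Business datasource: @Primary, so unqualified @Autowired injection points get this one.
    @Bean
    @Primary
    public DataSource dataSource() {
        return DataSourceBuilder.create()
                .url(env.getProperty("db.url"))
                .driverClassName(env.getProperty("db.driver"))
                .username(env.getProperty("db.username"))
                .password(env.getProperty("db.password"))
                .build();
    }

    // Wherever the batch configuration needs the metadata datasource, ask for it by name:
    @Autowired
    @Qualifier("batchDataSource")
    private DataSource batchDataSource;
}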

As per https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#howto-two-datasources :

@Bean
@Primary
@ConfigurationProperties("app.datasource.first")
public DataSourceProperties firstDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean
@Primary
@ConfigurationProperties("app.datasource.first")
public DataSource firstDataSource() {
    return firstDataSourceProperties().initializeDataSourceBuilder().build();
}

@Bean
@ConfigurationProperties("app.datasource.second")
public DataSourceProperties secondDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean
@ConfigurationProperties("app.datasource.second")
public DataSource secondDataSource() {
    return secondDataSourceProperties().initializeDataSourceBuilder().build();
}

In the application properties, you can use the regular datasource properties:

app.datasource.first.type=com.zaxxer.hikari.HikariDataSource
app.datasource.first.maximum-pool-size=30

app.datasource.second.url=jdbc:mysql://localhost/test
app.datasource.second.username=dbuser
app.datasource.second.password=dbpass
app.datasource.second.max-total=30

Assuming you have two datasources, one for the Spring Batch metadata such as job details [say CONFIGDB] and the other for your business data [say AppDB]:

Inject CONFIGDB into the jobRepository.
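
A minimal Java-config sketch of that wiring (JobRepositoryFactoryBean and the configDbDataSource bean name are assumptions here, so adapt them to your setup):

import javax.sql.DataSource;

import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.transaction.PlatformTransactionManager;

// Inside a @Configuration class:
@Bean
public JobRepository jobRepository(@Qualifier("configDbDataSource") DataSource configDb,
        PlatformTransactionManager transactionManager) throws Exception {
    // The Spring Batch metadata tables (BATCH_JOB_INSTANCE, BATCH_STEP_EXECUTION, ...) live in CONFIGDB.
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(configDb);
    factory.setTransactionManager(transactionManager);
    factory.afterPropertiesSet();
    return factory.getObject();
}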


Now you can inject the AppDB datasource into your DAOs or writers, if there are any..
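
For example, a DAO on the business side might receive it like this (a sketch; the appDbDataSource bean name is an assumption):

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

@Repository
public class ExampleDao {

    private final JdbcTemplate jdbcTemplate;

    // Business queries run against AppDB, never against the batch metadata datasource.
    public ExampleDao(@Qualifier("appDbDataSource") DataSource appDb) {
        this.jdbcTemplate = new JdbcTemplate(appDb);
    }
}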


OR

you can define a resource and inject this AppDB with a JNDI lookup in the class where it is needed:

public class ExampleDAO {

    @Resource(lookup = "java:comp/env/jdbc/AppDB")
    DataSource ds;
}
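
If you prefer to keep the lookup in configuration rather than on the field, Spring's JndiDataSourceLookup can expose the same JNDI resource as a regular bean (a sketch; the bean name is illustrative, and it assumes jdbc/AppDB is defined in your container):

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.jdbc.datasource.lookup.JndiDataSourceLookup;

// Inside a @Configuration class:
@Bean(name = "appDbDataSource")
public DataSource appDbDataSource() {
    JndiDataSourceLookup lookup = new JndiDataSourceLookup();
    return lookup.getDataSource("java:comp/env/jdbc/AppDB");
}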