Pentaho Data Integration - Kettle / PDI-16325

spark-app-builder.sh: allow custom named PDI folders

    Details

    • Type: Bug
    • Status: Closed
    • Severity: Unknown
    • Resolution: Cannot Reproduce
    • Affects Version/s: 7.1.0 GA
    • Fix Version/s: None
    • Component/s: AEL
    • Labels:
      None
    • Story Points:
      0
    • Notice:
      When an issue is open, the "Fix Version/s" field conveys a target, not necessarily a commitment. When an issue is closed, the "Fix Version/s" field conveys the version that the issue was fixed in.

      Description

      As a developer I keep several versions of PDI on my laptop and give their folders custom names. The `spark-app-builder.sh` script requires the PDI folder to be named `data-integration`; otherwise the script fails. Please change this behaviour.

        Activity

        tkaszuba Tomasz Kaszuba added a comment -

        I tried to work around this limitation with a symlink, but I get the following error:

        java.io.IOException: Is a directory
        	at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
        	at sun.nio.ch.FileDispatcherImpl.read(FileDispatcherImpl.java:46)
        	at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        	at sun.nio.ch.IOUtil.read(IOUtil.java:197)
        	at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:159)
        	at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:65)
        	at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:109)
        	at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
        	at java.io.InputStream.read(InputStream.java:101)
        	at java.nio.file.Files.copy(Files.java:2908)
        	at java.nio.file.Files.copy(Files.java:3069)
        	at org.pentaho.pdi.spark.driver.app.builder.ArchiveBuilderFileVisitor.visitFile(ArchiveBuilderFileVisitor.java:95)
        	at org.pentaho.pdi.spark.driver.app.builder.ArchiveBuilderFileVisitor.visitFile(ArchiveBuilderFileVisitor.java:41)
        	at java.nio.file.Files.walkFileTree(Files.java:2670)
        	at java.nio.file.Files.walkFileTree(Files.java:2742)
        	at org.pentaho.pdi.spark.driver.app.builder.ZipBuilder.compress(ZipBuilder.java:76)
        	at org.pentaho.pdi.spark.driver.app.builder.SparkDriverAssembly.build(SparkDriverAssembly.java:53)
        	at org.pentaho.pdi.spark.driver.app.builder.SparkDriverAppBuilder.main(SparkDriverAppBuilder.java:48)
        	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        	at java.lang.reflect.Method.invoke(Method.java:498)
        	at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
        	Suppressed: java.io.IOException: This archives contains unclosed entries.
        		at org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.finish(ZipArchiveOutputStream.java:413)
        		at org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.close(ZipArchiveOutputStream.java:806)
        		at org.pentaho.pdi.spark.driver.app.builder.ZipBuilder.compress(ZipBuilder.java:98)
        		... 7 more
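For what it's worth, the trace is consistent with `Files.walkFileTree` not following symbolic links by default: a symlink to a directory is handed to `visitFile` as if it were a regular file, and the subsequent `Files.copy` of a directory then fails with "Is a directory". A minimal sketch of that default behaviour (class and method names are illustrative, not taken from the PDI sources):

```java
import java.io.IOException;
import java.nio.file.*;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.List;

public class SymlinkWalkDemo {
    // Collects the relative paths that walkFileTree hands to visitFile().
    // Without the FOLLOW_LINKS option, a symlink to a directory is NOT
    // descended into: it reaches visitFile() itself, so code that then
    // opens it for reading fails with "java.io.IOException: Is a directory".
    static List<String> filesSeen(Path root) throws IOException {
        List<String> seen = new ArrayList<>();
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                seen.add(root.relativize(file).toString());
                return FileVisitResult.CONTINUE;
            }
        });
        return seen;
    }

    public static void main(String[] args) throws IOException {
        // Tiny tree: root/real-dir/file.txt plus root/link -> real-dir
        Path root = Files.createTempDirectory("walk-demo");
        Path realDir = Files.createDirectory(root.resolve("real-dir"));
        Files.write(realDir.resolve("file.txt"), "hello".getBytes());
        Files.createSymbolicLink(root.resolve("link"), realDir);

        // "link" is reported as a plain file, not walked as a directory
        System.out.println(filesSeen(root));
    }
}
```

Passing `EnumSet.of(FileVisitOption.FOLLOW_LINKS)` to `walkFileTree` would make the walker descend into the symlinked directory instead.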
        
        mbatchelor Marc Batchelor added a comment -

        The shell script doesn't require this at all. Please see the shell script as it exists in the 7.1.0.0-R tag:
        https://github.com/pentaho/pentaho-kettle/blob/7.1.0.0-R/assembly/package-res/spark-app-builder.sh

        This must be something environmental.

        diddy Diethard Steiner added a comment -

        I still have the same problem in v8. What kind of environment variable would influence this?

        [dsteiner@localhost pdi-ce-8.0]$ sh ./spark-app-builder.sh
        #######################################################################
        WARNING: no libwebkitgtk-1.0 detected, some features will be unavailable
        Consider installing the package with apt-get or yum.
        e.g. 'sudo apt-get install libwebkitgtk-1.0-0'
        #######################################################################
        Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
        ERROR: pdiLocation must point to a valid data-integration folder. /home/dsteiner/apps/pdi-ce-8.0 is not valid.
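
The error text above is printed after the JVM warning, so it appears to come from the Java builder rather than the shell wrapper. A hypothetical sketch of the kind of check that would produce it, assuming a simple test on the last path component (the actual SparkDriverAppBuilder sources are not quoted in this ticket, so the method name and logic are assumptions):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class PdiLocationCheckSketch {
    // Hypothetical validation: reject any PDI location whose folder is not
    // literally named "data-integration", mirroring the reported error.
    static void validate(Path pdiLocation) {
        Path name = pdiLocation.getFileName();
        if (name == null || !"data-integration".equals(name.toString())) {
            throw new IllegalArgumentException(
                "ERROR: pdiLocation must point to a valid data-integration folder. "
                    + pdiLocation + " is not valid.");
        }
    }

    public static void main(String[] args) {
        try {
            // A custom-named folder, as in the report above, is rejected
            validate(Paths.get("/home/dsteiner/apps/pdi-ce-8.0"));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

If the check really is by folder name, a symlink would not help either, since the link's own name would still have to be `data-integration`.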


          People

          • Assignee:
            Unassigned
          • Reporter:
            diddy Diethard Steiner
          • Votes:
            0
          • Watchers:
            5

            Dates

            • Created:
            • Updated:
            • Resolved: