Package de.maramuse.soundcomp.files

Source Code of de.maramuse.soundcomp.files.InputFile

package de.maramuse.soundcomp.files;

/**
 * This file is presently a stub, lacking important parts of its implementation, although it can
 * already be used for primitive cases. Remove this comment when it is no longer true.
*/

/*
* Copyright 2010 Jan Schmidt-Reinisch
*
* SoundComp - a sound processing library
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; in version 2.1
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
* Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public
* License along with this library; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301  USA
*
*/

/**
* A file import class reading RIFF (wave) compatible (and similar) files.
 * For now this class only provides wave file read access; support for Monkey's Audio and FLAC should follow.
*
* RIFF wave file interpretation info:
*
*   Information on correct reading of WAV files can be found at
*    http://www.sonicspot.com/guide/wavefiles.html
*    http://www-mmsp.ece.mcgill.ca/documents/audioformats/wave/wave.html
 *   You need to read both, as each of them omits or abbreviates important information,
 *   but combining the information from both gives a rather complete picture.
*   SoundComp must be able to handle type 0 (PCM) and type 3 (IEEE_FLOAT) RIFF (WAV) files.
*   Note that there are substantial header differences between both, but SoundComp should be able
*   to also read files with missing (in case of FLOAT where it is mandatory by definition) or
*   superfluous (in case of PCM where it mustn't appear by definition) 'ExtraFormatBytes' fields,
*   since many file creators go wrong on this point in one way or the other.
*   SoundComp never writes "plst" and "slnt" chunks - but should be able to interpret files containing such.
 *   Other chunks that are not meant to change data interpretation, such as 'cue', 'labl', 'note', 'ltxt', 'fact',
 *   and any others that have no meaning in SoundComp, should presently be ignored gracefully.
 *   Some of them may acquire a meaning in SoundComp later.
*   Information from an 'inst' chunk, should it be available, should be stored and
*   kept retrievable by the control logic (for the case of sound bank/sampling workstation generators).
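 *
 *   For reference, the byte layout of the canonical 'fmt ' chunk as assumed by detect() below
 *   (all fields little endian, offsets relative to the start of the chunk data):
 *   <pre>
 *     offset  0, 2 bytes: wFormatTag      (1 = PCM, 3 = IEEE float)
 *     offset  2, 2 bytes: nChannels
 *     offset  4, 4 bytes: nSamplesPerSec
 *     offset  8, 4 bytes: nAvgBytesPerSec (ignored here)
 *     offset 12, 2 bytes: nBlockAlign     (ignored here)
 *     offset 14, 2 bytes: wBitsPerSample
 *     offset 16, 2 bytes: cbSize          (extension size, 0 or 22; only present when the chunk size is 18)
 *   </pre>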
*
* A test case for each supported type should be created and tested under big and little endian.
*
* This DSP object offers the stored file sample rate on output FREQUENCY. It doesn't have any notion
* of any sound frequencies within the file. It is meant to be used in a context where the fundamental
* frequency of contained sound is either irrelevant or given by configuration.
*
 * No resampling or pitch transposition is carried out yet. Such things should become possible, but
 * will come later. For this purpose, a FACTOR input is available that defines a pitch transposition factor.
 * When resampling is eventually carried out, FACTOR must be corrected by the sample rate ratio before resampling.
 * An InputFile that is being transposed by a FACTOR should not be transposed by a frequency ratio of more
 * than approx. 2^(1/6), i.e. about 1.12, in either direction. Pitch shifting should not employ TDHS but rather PSOLA
 * to avoid glitches in the signal.
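 *
 * A minimal sketch of that correction, assuming an InputFile "file" and a hypothetical "requestedFactor"
 * taken from the FACTOR input (none of this is implemented yet):
 * <pre>{@code
 *   double engineRate    = GlobalParameters.get().getSampleRate();
 *   double rateRatio     = file.getFileSampleRate() / engineRate;   // resampling factor due to differing rates
 *   double effectiveStep = requestedFactor * rateRatio;             // read pointer increment per output sample
 * }</pre>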
*
* The inputs IN, IN_IMAG and GATE are only considered when all of them are connected at once, and provide
* positive values:
 * Inputs IN and IN_IMAG carry two sample indexes that define a loop. If input GATE is high (>=0.5)
 * when the sample with index IN_IMAG is being read, reading continues at index IN. This may skip part
 * of the file (if IN>IN_IMAG), but the intended use is to supply IN<IN_IMAG, in which case the file is looped
 * between these two indexes as long as GATE is kept high. If any of these inputs remains unconnected, IN or
 * IN_IMAG is less than zero, or IN_IMAG is larger than the sample count of the file, then the file is played
 * in one go and then stops. If IN is larger than the sample count of the file and GATE is high when the IN_IMAG-th
 * sample is read, then playback stops at that point.
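 *
 * A minimal usage sketch for the loop mechanism ("loopPoints" and "envelope" are hypothetical NamedSource
 * instances; the parameter indexes and the setSource() signature are the ones defined in this class):
 * <pre>{@code
 *   InputFile file = new InputFile("loop.wav");
 *   // checked exceptions (UnknownConnectionException, TypeMismatchException) omitted for brevity
 *   file.setSource(StandardParameters.IN.i,      loopPoints, StandardParameters.OUT.i);      // loop start index
 *   file.setSource(StandardParameters.IN_IMAG.i, loopPoints, StandardParameters.OUT_IMAG.i); // loop end index
 *   file.setSource(StandardParameters.GATE.i,    envelope,   StandardParameters.GATE.i);     // loop while high
 * }</pre>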
*
 * On channels -1..-n it outputs the n channels of sound from the file. It always outputs at least 2
 * channels; monaural files duplicate their single channel onto both. Stereo or multichannel
 * files will not get downmixed - if fewer channels are used than the file contains, the others are disregarded.
 * If more channels are queried than the file contains (and more than two), an error occurs.
 * Channels -1 and -2 are mirrored on OUT and OUT_IMAG, to allow "usual" addressing of outputs for mono and stereo
 * files.
* Output GATE reflects the read state of the file: 0=not reading, 1=reading. External logic may use the falling
* edge to detect the end of the file, to possibly terminate the enclosing event.
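 *
 * A minimal sketch of reading the outputs once per processing step (a hypothetical driver loop; in a real
 * graph the framework calls advanceState() and getValue()):
 * <pre>{@code
 *   file.advanceState();                                            // move to the next sample frame
 *   double left    = file.getValue(StandardParameters.OUT.i);       // channel -1
 *   double right   = file.getValue(StandardParameters.OUT_IMAG.i);  // channel -2 (same as -1 for mono files)
 *   double reading = file.getValue(StandardParameters.GATE.i);      // 1.0 while the file is being read
 * }</pre>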
*
* TODO: caching of file content (optional, selectable via input), especially important when using compressed
*      data (flac, ape) where multiple decompression of the same file may be a CPU resource drain, but also
*      might help for WAV files on systems with enough RAM and slow drives
*/
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.InputStream;
import java.io.IOException;

import de.maramuse.soundcomp.process.NamedSource;
import de.maramuse.soundcomp.process.ParameterMap;
import de.maramuse.soundcomp.process.ProcessElement;
import de.maramuse.soundcomp.process.SourceStore;
import de.maramuse.soundcomp.process.StandardParameters;
import de.maramuse.soundcomp.process.Stateful;
import de.maramuse.soundcomp.process.TypeMismatchException;
import de.maramuse.soundcomp.process.UnknownConnectionException;
import de.maramuse.soundcomp.process.ValueType;
import de.maramuse.soundcomp.process.StandardParameters.Parameter;
import de.maramuse.soundcomp.util.GlobalParameters;
import de.maramuse.soundcomp.util.NotImplementedException;
import de.maramuse.soundcomp.util.ReadOnlyMap;
import de.maramuse.soundcomp.util.ReadOnlyMapImpl;
import de.maramuse.soundcomp.util.NativeObjects;

public class InputFile implements ProcessElement, Stateful {

  public long nativeSpace=-1;
  private String instanceName;
  private String abstractName;

  private ReadOnlyMapImpl<Integer, ValueType> srcMap=new ReadOnlyMapImpl<Integer, ValueType>();
  private static ReadOnlyMapImpl<Integer, ValueType> destMap=new ReadOnlyMapImpl<Integer, ValueType>();
  private ReadOnlyMapImpl<Integer, SourceStore> sourceStoreMap=new ReadOnlyMapImpl<Integer, SourceStore>();
  private SourceStore loopstart, loopend, gate, factor;
  private static ParameterMap inputsMap=new ParameterMap();
  private ParameterMap outputsMap=new ParameterMap();
  static{
  destMap.put(StandardParameters.GATE.i, ValueType.STREAM);
  destMap.put(StandardParameters.FREQUENCY.i, ValueType.STREAM);
  // presumably the other connections accepted by setSource() should be advertised here as well
  destMap.put(StandardParameters.IN.i, ValueType.STREAM);
  destMap.put(StandardParameters.IN_IMAG.i, ValueType.STREAM);
  destMap.put(StandardParameters.FACTOR.i, ValueType.STREAM);
  // TODO this should be a pitch adaption factor. Currently no pitch adaption or resampling is carried out,
  // but it _should_ be done. The resampling factor can be determined by the sample rate of the file,
  // compared to the sample rate of the current compilation. The pitch adaption factor needs to come from the
  // outside, for example from scale information if this file is being read as part of a SoundBank.
  inputsMap.put(StandardParameters.FACTOR);
  inputsMap.put(StandardParameters.IN);
  inputsMap.put(StandardParameters.IN_IMAG);
  inputsMap.put(StandardParameters.GATE);
  // TODO special handling, like mixer
  }

  private int nChannels;
  private final String name;
  private DataList[] samples;
  private int wSampleRate;
  private int wFileSampleRate;
  private int readIndex=-1;
  private boolean floatFormat;
  private double active=0.0;

  /**
   * Create a wave file input element for an n-channel file. The exact format is detected when the file
   * is opened.
   *
   * @param name
   *          the name of the file the data is read from
   */
  public InputFile(String name) {
  this.name=name;
  NativeObjects.registerNativeObject(this);
  InputStream stream;
  try{
    stream=new FileInputStream(name);
    detect(stream);
  }catch(FileNotFoundException e){
    nChannels=0;
  }catch(Exception e){
    nChannels=0;
  }
  // if the file could not be read at all, keep an empty channel array to avoid NullPointerExceptions later
  if(samples==null)
    samples=new DataList[nChannels];
  wSampleRate=(int)GlobalParameters.get().getSampleRate();
  outputsMap.put(StandardParameters.FREQUENCY);
  srcMap.put(StandardParameters.FREQUENCY.i, ValueType.STREAM);
  // we have a special output called "alive" with index -1, non-standard, only for SoundBank
  outputsMap.put(StandardParameters.GATE);
  srcMap.put(StandardParameters.GATE.i, ValueType.STREAM);
  outputsMap.put(StandardParameters.OUT);
  srcMap.put(StandardParameters.OUT.i, ValueType.STREAM);
  outputsMap.put(StandardParameters.OUT_IMAG);
  srcMap.put(StandardParameters.OUT_IMAG.i, ValueType.STREAM);
  for(int ix=1;ix<=nChannels;ix++){
    outputsMap.put(new StandardParameters.Parameter("channel"+ix, -ix));
    srcMap.put(-ix, ValueType.STREAM);
  }
  }

  /**
   * Create a wave file input element for an n-channel InputStream. The exact format is detected from the
   * stream header.
   *
   * @param stream
   *          the InputStream that the file data is read from
   */
  public InputFile(InputStream stream) {
  this.name=""; // not needed to have a name in that case
  NativeObjects.registerNativeObject(this);
  try{
    detect(stream);
  }catch(Exception ex){
    nChannels=0;
  }
  if(samples==null){
    // only allocate empty channels if detect() failed; otherwise detect() has already filled them with data
    samples=new DataList[nChannels];
    for(int i=0; i<nChannels; i++)
      samples[i]=new DataList();
  }
  wSampleRate=(int)GlobalParameters.get().getSampleRate();
  outputsMap.put(StandardParameters.FREQUENCY);
  srcMap.put(StandardParameters.FREQUENCY.i, ValueType.STREAM);
  // we have a special output called "alive" with index -1, non-standard, only for SoundBank
  outputsMap.put(StandardParameters.GATE);
  srcMap.put(StandardParameters.GATE.i, ValueType.STREAM);
  outputsMap.put(StandardParameters.OUT);
  srcMap.put(StandardParameters.OUT.i, ValueType.STREAM);
  outputsMap.put(StandardParameters.OUT_IMAG);
  srcMap.put(StandardParameters.OUT_IMAG.i, ValueType.STREAM);
  for(int ix=1;ix<=nChannels;ix++){
    outputsMap.put(new StandardParameters.Parameter("channel"+ix, -ix));
    srcMap.put(-ix, ValueType.STREAM);
  }
  }

  /**
   * This constructor applies when a file input element is to be created from C++ code. This scenario is not yet
   * supported, but should be implemented at some point. Setting a name is not supported in that case; the input
   * stream is then provided by native code. The number of channels must be determined from native code in
   * advance.
   *
   * @param nChannels
   *          the number of channels contained in the file
   * @param s
   *          ignored, used for distinguishing constructors
   */
  InputFile(boolean s, int nChannels) {
  this.name=""; // not necessary to know, the inputstream will be set from native
  this.nChannels=nChannels;
  samples=new DataList[nChannels];
  for(int i=0; i<nChannels; i++)
    samples[i]=new DataList();
  wSampleRate=(int)GlobalParameters.get().getSampleRate();
  outputsMap.put(StandardParameters.FREQUENCY);
  srcMap.put(StandardParameters.FREQUENCY.i, ValueType.STREAM);
  // we have a special output called "alive" with index -1, non-standard, only for SoundBank
  outputsMap.put(StandardParameters.GATE);
  srcMap.put(StandardParameters.GATE.i, ValueType.STREAM);
  outputsMap.put(StandardParameters.OUT);
  srcMap.put(StandardParameters.OUT.i, ValueType.STREAM);
  outputsMap.put(StandardParameters.OUT_IMAG);
  srcMap.put(StandardParameters.OUT_IMAG.i, ValueType.STREAM);
  for(int ix=1;ix<=nChannels;ix++){
    outputsMap.put(new StandardParameters.Parameter("channel"+ix, -ix));
    srcMap.put(-ix, ValueType.STREAM);
  }
  }

  public int getNrSamples() {
  int nrSamples=0;
  for(DataList l:samples){
    if(l!=null&&l.size()>nrSamples)
    nrSamples=l.size();
  }
  return nrSamples;
  }

  /**
   * converts a variable-size little-endian byte array to the corresponding int value
   *
   * @param bytes a 2- or 4-byte little-endian value
   * @return the corresponding int value (unsigned 16-bit for 2 bytes, signed 32-bit for 4 bytes), 0 otherwise
   */
  private static int getInt(byte[] bytes) {
  if(bytes.length==2){
    return ((bytes[1]&255)<<8)|(bytes[0]&255);
  }else if(bytes.length==4){
    return ((((((bytes[3]&255)<<8)|(bytes[2]&255))<<8)|(bytes[1]&255))<<8)|(bytes[0]&255);
  }else
    return 0;
  }

  /**
   * attempt to find out the format of the stream data, and initialize the InputFile accordingly so that future read
   * operations correctly interpret the sampled data
   *
   * @param s
   *          the InputStream with header data
   * @throws Exception
   */
  public void detect(InputStream s) throws Exception {
  byte lc[]=new byte[4];
  byte wc[]=new byte[2];
  String ename=(name!=null&&name.length()>0) ? " "+name : " an unnamed sample stream";
  try{
    s.read(lc);
    if(lc[0]!='R'||lc[1]!='I'||lc[2]!='F'||lc[3]!='F'){
    // TODO: 'M' 'A' 'C' ' ' -> read monkey's audio
    // 'O' 'g' 'g' -> read ogg container (need to analyze further whether flac or vorbis content)
    // 'f' 'L' 'a' 'C' -> read flac (raw, no container)
    throw new Exception("cannot determine the format of"+ename
      +"; only RIFF wave files currently supported");
    }
    s.read(lc);
    int len=getInt(lc)-28; // size of the data chunks
    if(len<=0)
    throw new Exception("stream header of"+ename+" too short or file size info in header corrupt");
    s.read(lc);
    if(lc[0]!='W'||lc[1]!='A'||lc[2]!='V'||lc[3]!='E')
    throw new Exception("stream header of RIFF file"+ename+" not WAV compliant: format not 'WAVE', but '"+lc[0]+lc[1]+lc[2]+lc[3]+"'");
    s.read(lc);
    if(lc[0]!='f'||lc[1]!='m'||lc[2]!='t')
    throw new Exception("stream header of RIFF file"+ename+" not WAV compliant: format header not present or doesn't start with 'fmt'");
    s.read(lc);
    int fmtsize=getInt(lc);
    if(fmtsize<16)
    throw new Exception("stream header of RIFF file"+ename+" not WAV compliant: size of format header < 16 (is "+fmtsize+")");
    s.read(wc);
    int wFormat=getInt(wc);
    if(wFormat!=1&&wFormat!=3)
    throw new Exception("stream header of RIFF file"+ename+" neither 1=PCM nor 3=float, but instead "+wFormat);
    floatFormat=wFormat==3;
    s.read(wc);
    nChannels=getInt(wc);
    samples=new DataList[nChannels];
    for(int i=0; i<nChannels; i++)
    samples[i]=new DataList();
    s.read(lc);
    wFileSampleRate=getInt(lc);
    s.read(lc);
    /* int bytesPerSec = getInt(lc); we don't need this info */
    s.read(wc);
    /* int bytesPerSample = getInt(wc); we don't need this info */
    s.read(wc);
    int bitsPerSample=getInt(wc);
    int headerextensionsize=0;
    if(fmtsize==18){
    // if we expect a header extension field, read its size (the value from the stream, not the byte count read)
    s.read(wc);
    headerextensionsize=getInt(wc);
    }else if(fmtsize!=16)
    throw new Exception("illegal fmt header size in RIFF header, only 16 or 18 supported");
    if(headerextensionsize!=0){
    if(headerextensionsize!=22)
      throw new Exception(
        "illegal fmt header extension size "+headerextensionsize+" in RIFF header, only 0 or 22 supported");
    // skip 22 bytes if header extension present. these would be:
    // 2 bytes valid bits per sample, to allow reduced precision samples
    // 4 bytes channel mask/speaker position mask, for N.M surround files
    // 16 bytes sub format GUID, includes leading 2 bytes data format code
    for(int i=0; i<11; i++)
      s.read(wc);
    }
    int rlen=0;
    int l;
    while(rlen<len){
    l=readRIFFChunk(s, bitsPerSample);
    if(l<=0)
      break;
    rlen+=l;
    }
  }catch(IOException ex){
    // truncated or unreadable stream: keep whatever samples have been read so far
  }
  }

  private int readRIFFChunk(InputStream s, int bitsPerSample) throws Exception {
  byte lc[]=new byte[4];
  int len;
  try{
    s.read(lc);
    boolean isData=(lc[0]=='d')&&(lc[1]=='a')&&(lc[2]=='t')&&(lc[3]=='a');
    s.read(lc);
    len=getInt(lc);
    if((len&1)!=0)
    len++;
    if(isData){
    // read in the samples
    int bytesPerSample=((bitsPerSample+7)>>3);
    int nFrames=len/nChannels/bytesPerSample;
    int bytesPerFrame=bytesPerSample*nChannels;
    byte[] frame=new byte[bytesPerFrame];
    switch(bytesPerSample){
      case 1: // 8 bit int with offset
      for(int i=0; i<nFrames; i++){
        s.read(frame);
        for(int c=0; c<nChannels; c++){
        samples[c].add(((frame[c]&255)-128)/128.0);
        }
      }
      break;
      case 2: // 16 bit int
      final double fact2=32768.0;
      for(int i=0; i<nFrames; i++){
        s.read(frame);
        for(int c=0; c<nChannels; c++){
        int v=((frame[1+c*2]&255)<<8)+(frame[c*2]&255);
        if((v&32768)!=0)
          v-=65536;
        samples[c].add(v/fact2);
        }
      }
      break;
      case 3: // 24 bit int
      final double fact3=32768.0*256.0;
      for(int i=0; i<nFrames; i++){
        s.read(frame);
        for(int c=0; c<nChannels; c++){
        int v=((frame[2+c*3]&255)<<16)+((frame[1+c*3]&255)<<8)+(frame[c*3]&255);
        if((v&0x800000)!=0)
          v-=0x1000000;
        samples[c].add(v/fact3);
        }
      }
      break;
      case 4: // 32 bit per sample can be either float or int
      if(floatFormat){
        for(int i=0; i<nFrames; i++){
        s.read(frame);
        for(int c=0; c<nChannels; c++){
          int v=((frame[3+c*4]&255)<<24)+((frame[2+c*4]&255)<<16)+((frame[1+c*4]&255)<<8)
            +(frame[c*4]&255);
          samples[c].add(Float.intBitsToFloat(v));
        }
        }
      }else{
        final double fact4=32768.0*65536.0;
        for(int i=0; i<nFrames; i++){
        s.read(frame);
        for(int c=0; c<nChannels; c++){
          // assemble the unsigned 32-bit value in long arithmetic to avoid premature int sign extension
          long v=(((long)(frame[3+c*4]&255))<<24)+((frame[2+c*4]&255)<<16)+((frame[1+c*4]&255)<<8)
            +(frame[c*4]&255);
          if((v&0x80000000L)!=0)
          v-=0x100000000L;
          samples[c].add(v/fact4);
        }
        }
      }
      break;
      case 8: // double
      if(floatFormat){
        for(int i=0; i<nFrames; i++){
        s.read(frame);
        for(int c=0; c<nChannels; c++){
          long ld=(((long)(frame[7+c*8]&255))<<56)+(((long)(frame[6+c*8]&255))<<48)
            +(((long)(frame[5+c*8]&255))<<40)+(((long)(frame[4+c*8]&255))<<32)
            +(((long)(frame[3+c*8]&255))<<24)+((frame[2+c*8]&255)<<16)+((frame[1+c*8]&255)<<8)
            +(frame[c*8]&255);
          double d=Double.longBitsToDouble(ld);
          samples[c].add(d);
        }
        }
      }else{
        throw new Exception("stream "+name
          +" contains 64-bit integer sample data, which is not supported");
      }
      break;
      default:
        throw new Exception("stream "+name
          +" contains "+bitsPerSample+" bit "+(floatFormat?"float":"integer")
          +" sample data, which is not supported (only 8i,16i,24i,32i,32f,64f allowed)");
    }
    }else{
    // just skip this chunk
    s.skip(len);
    }
  }catch(IOException ex){
    return -1;
  }
  return len+8;
  }

  // ///////////////////////////////////////////////////////////////////////////
  // standard getters/setters
  // ///////////////////////////////////////////////////////////////////////////

  public int getWSampleRate() {
  return wSampleRate;
  }

  public void setWSampleRate(int wSampleRate) {
  this.wSampleRate=wSampleRate;
  }

  public int getNChannels() {
  return nChannels;
  }

  public int getNSamples() {
  return samples[0].size();
  }

  public DataList getChannel(int ch) {
  if(ch<0||ch>=nChannels)
    throw new IllegalArgumentException("Channel "+ch+" does not exist");
  return samples[ch];
  }

  // ///////////////////////////////////////////////////////////////////////////
  // ProcessElement interface
  // ///////////////////////////////////////////////////////////////////////////

  @Override
  public ReadOnlyMap<Integer, ValueType> getDestinationTypes() {
  return destMap;
  }

  @Override
  public void setSource(int connectionIndex, NamedSource source,
            int sourceIndex) throws UnknownConnectionException,
      TypeMismatchException {
  if(connectionIndex==StandardParameters.IN.i){
    loopstart=new SourceStore(source, sourceIndex);
  }else if(connectionIndex==StandardParameters.IN_IMAG.i){
    loopend=new SourceStore(source, sourceIndex);
  }else if(connectionIndex==StandardParameters.GATE.i){
    gate=new SourceStore(source, sourceIndex);
  }else if(connectionIndex==StandardParameters.FACTOR.i){
    factor=new SourceStore(source, sourceIndex);
  }
  sourceStoreMap.put(connectionIndex, new SourceStore(source, sourceIndex));
  }

  @Override
  public ReadOnlyMap<Integer, ValueType> getSourceTypes() {
  return srcMap;
  }

  @Override
  public double getValue(int index) {
  int ix=-index;
  // FREQUENCY -> sample rate of file
  if(-ix==StandardParameters.FREQUENCY.i)
    return getFileSampleRate();
  // GATE -> are we still reading?
  if(-ix==StandardParameters.GATE.i)
    return active;
  // OUT -> first channel
  if(-ix==StandardParameters.OUT.i)
    ix=1;
  // OUT_IMAG -> second channel
  else if(-ix==StandardParameters.OUT_IMAG.i)
    ix=2;
  // channel 2 on mono file: duplicate only channel
  if(ix==2&&nChannels==1)ix=1;
  // ix too high otherwise: fail
  if(ix>nChannels)
    throw new IllegalArgumentException("cannot read channel "+ix+" of "+nChannels+" in file "
      +name);
  // ix=-1..-n: -ix-th channel, indexed as 0..n-1
  ix--;
  if(readIndex==-1)return 0; // file not yet processed
  if(readIndex<-1||readIndex>=samples[ix].size())
    throw new IllegalArgumentException("attempt to read sample "+readIndex+" of file "+name
      +" with "+samples[ix].size()+" samples");
  return samples[ix].get(readIndex);
  }

  @Override
  public void advanceOutput() {
  // nothing to do
  }

  @Override
  public void advanceState() {
  // do not attempt to read beyond end of input file
  if(readIndex<samples[0].size()-1){
    readIndex++;
    active=1.0;
  }else
    active=0.0;
  // loop if the conditions for looping are met
  if(loopstart!=null&&loopend!=null&&gate!=null){
    if(gate.getValue()>=0.5){
    int le=(int)loopend.getValue();
    if(le>=0 && readIndex>=le){
      int ri=(int)loopstart.getValue();
      if(ri>=0){
      if(ri<samples[0].size())
        readIndex=ri;
      else
        readIndex=samples[0].size()-1;
      }
    }
    }
  }
  }

  @Override
  public String getAbstractName() {
  return abstractName;
  }

  @Override
  public String getInstanceName() {
  return instanceName;
  }

  @Override
  public void setAbstractName(String abstractName) {
  this.abstractName=abstractName;
  }

  @Override
  public void setInstanceName(String instanceName) {
  this.instanceName=instanceName;
  }

  @Override
  public long getNativeSpace() {
  return nativeSpace;
  }

  // private void cleanupSubObjects(){
  //
  // }

  public double getFileSampleRate() {
  return wFileSampleRate;
  }

  @Override
  public ReadOnlyMap<Integer, SourceStore> getSourceMap() {
  return sourceStoreMap;
  }

  /**
   * @see de.maramuse.soundcomp.process.ProcessElement#clone()
   *
   *      But: this ProcessElement is usually not for use in single events. Should we throw an Exception on a cloning
   *      attempt? Maybe not, as we might have "voice templates" later on.
   */
  @Override
  public InputFile clone() {
  throw new NotImplementedException("InputFile cloning not yet supported");
  // InputFile c=new InputFile(nChannels);
  // c.abstractName=abstractName;
  // return c;
  }

  /*
   * (non-Javadoc)
   *
   * @see de.maramuse.soundcomp.process.ProcessElement#outputsByName()
   */
  @Override
  public ReadOnlyMap<String, Parameter> outputsByName() {
  return outputsMap;
  }

  /*
   * (non-Javadoc)
   *
   * @see de.maramuse.soundcomp.process.ProcessElement#inputsByName()
   */
  @Override
  public ReadOnlyMap<String, Parameter> inputsByName() {
  return inputsMap;
  }
}