
JavaSound

JavaSound has no direct support for Karaoke. This chapter looks at how to combine the JavaSound libraries with other libraries such as Swing to give a Karaoke player for MIDI files.

Resources

Introduction

Java has no library support for Karaoke: that is too application specific. In this chapter we give code for a Karaoke player that can play KAR files. The player shows two lines of lyrics, with words already sung highlighted in red. Along the top, a simple piano keyboard shows the notes as they are played on the melody channel of the MIDI file. In the middle it shows the scrolling melody line, with a vertical line in the centre marking the currently playing note.

The player looks like this:

The UML diagram is basically as follows:

KaraokePlayer

The KaraokePlayer class extracts the filename of the Karaoke file and creates a MidiPlayer to handle the file:

      
      /*
 * KaraokePlayer.java
 *
 */

import javax.swing.*;

public class KaraokePlayer {


    public static void main(String[] args) throws Exception {
	if (args.length != 1) {
	    System.err.println("KaraokePlayer: usage: " +
			     "KaraokePlayer <midifile>");
	    System.exit(1);
	}
	String	strFilename = args[0];

	MidiPlayer midiPlayer = new MidiPlayer();
	midiPlayer.playMidiFile(strFilename);
    }
}




      
    

MidiPlayer

The MidiPlayer creates a Sequence from the file. Sequence information is required in many places, so rather than pass the sequence around as a parameter, it is stored in a static singleton object, SequenceInformation. This makes the sequence effectively a global object in the system.

The player then gets the default sequencer and transmits MIDI events to two Receiver objects: the default synthesizer, to play the events, and a DisplayReceiver, to manage all the GUI handling. The Sequencer method getTransmitter() is mis-named: each time it is called it returns a new transmitter, each one feeding the same events to its own receiver. From the Java SE documentation, chapter 10, "Transmitting and Receiving MIDI Messages":

This code [in their example] introduces a dual invocation of the MidiDevice.getTransmitter method, assigning the results to inPortTrans1 and inPortTrans2. As mentioned earlier, a device can own multiple transmitters and receivers. Each time MidiDevice.getTransmitter() is invoked for a given device, another transmitter is returned, until no more are available, at which time an exception will be thrown.

That way, the sequencer can send to two receivers.
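
As a minimal sketch of this wiring (hypothetical demo code, not the player itself, assuming the default sequencer and synthesizer are available), the same event stream can be fed both to the synthesizer and to a second, display-only receiver:

import javax.sound.midi.*;
import java.io.File;

// Sketch: one sequencer feeding two receivers through two transmitters
public class TwoReceivers {
    public static void main(String[] args) throws Exception {
        Sequencer sequencer = MidiSystem.getSequencer(false); // not auto-wired
        Synthesizer synthesizer = MidiSystem.getSynthesizer();
        sequencer.open();
        synthesizer.open();

        // each call to getTransmitter() returns a fresh Transmitter,
        // so both receivers see the same stream of events
        sequencer.getTransmitter().setReceiver(synthesizer.getReceiver());
        sequencer.getTransmitter().setReceiver(new Receiver() {
                public void send(MidiMessage msg, long timeStamp) {
                    System.out.println("Event at " + timeStamp);
                }
                public void close() {}
            });

        sequencer.setSequence(MidiSystem.getSequence(new File(args[0])));
        sequencer.start();
    }
}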

Receivers do not get MetaMessages, which carry information such as Text and Lyric events. The DisplayReceiver is therefore also registered as a MetaEventListener so that it can handle these events as well.

The MidiPlayer is:

      
      

import javax.sound.midi.MidiSystem;
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.Sequence;
import javax.sound.midi.Receiver;
import javax.sound.midi.Sequencer;
import javax.sound.midi.Transmitter;
import javax.sound.midi.MidiChannel;
import javax.sound.midi.MidiDevice;
import javax.sound.midi.Synthesizer;
import javax.sound.midi.ShortMessage;
import javax.sound.midi.SysexMessage;

import java.io.File;
import java.io.IOException;

public class MidiPlayer {

    private DisplayReceiver receiver;

    public  void playMidiFile(String strFilename) throws Exception {
	File	midiFile = new File(strFilename);

	/*
	 *	We try to get a Sequence object, loaded with the content
	 *	of the MIDI file.
	 */
	Sequence	sequence = null;
	try {
	    sequence = MidiSystem.getSequence(midiFile);
	}
	catch (InvalidMidiDataException e) {
	    e.printStackTrace();
	    System.exit(1);
	}
	catch (IOException e) {
	    e.printStackTrace();
	    System.exit(1);
	}

	if (sequence == null) {
	    out("Cannot retrieve Sequence.");
	} else {
	    SequenceInformation.setSequence(sequence);
	    playMidi(sequence);
	}
    }

    public  void playMidi(Sequence sequence) throws Exception {

        Sequencer sequencer = MidiSystem.getSequencer(true);
        sequencer.open();
        sequencer.setSequence(sequence); 

	receiver = new DisplayReceiver(sequencer);
	sequencer.getTransmitter().setReceiver(receiver);
	sequencer.addMetaEventListener(receiver);

	if (sequencer instanceof Synthesizer) {
	    Debug.println("Sequencer is also a synthesizer");
	} else {
	    Debug.println("Sequencer is not a synthesizer");
	}
        //sequencer.start();

	/*
	Synthesizer synthesizer = MidiSystem.getSynthesizer();  
	synthesizer.open();  

	if (synthesizer.getDefaultSoundbank() == null) {
	    // then you know that java sound is using the hardware soundbank
	    Debug.println("Synthesizer using h/w soundbank");
	} else Debug.println("Synthesizer using s/w soundbank");


	Receiver synthReceiver = synthesizer.getReceiver();  
	Transmitter seqTransmitter = sequencer.getTransmitter();  
	seqTransmitter.setReceiver(synthReceiver); 
	MidiChannel[] channels = synthesizer.getChannels(); 
	Debug.println("Num channels is " + channels.length);
	*/
        sequencer.start();

	/* default synth doesn't support pitch bending
	Synthesizer synthesizer = MidiSystem.getSynthesizer();  
	MidiChannel[] channels = synthesizer.getChannels(); 
	for (int i = 0; i < channels.length; i++) {
	    System.out.printf("Channel %d has bend %d\n", i, channels[i].getPitchBend());
	    channels[i].setPitchBend(16000);
	    System.out.printf("Channel %d now has bend %d\n", i, channels[i].getPitchBend());
	}
	*/

	/* set volume - doesn't work */
	/*
	for (int i = 0; i < channels.length; i++) {
	    channels[i].controlChange(7, 0);
	}
	*/
	/*
	System.out.println("Turning notes off");
	for (int i = 0; i < channels.length; i++) {
	    channels[i].allNotesOff();
	    channels[i].allSoundOff();
	}
	*/

	/* set volume - doesn't work either */
	/*
	try {
	    Thread.sleep(5000);
	} catch (InterruptedException e) {
	    // TODO Auto-generated catch block
	    e.printStackTrace();
	}
	
	if (synthReceiver == MidiSystem.getReceiver()) 
	    System.out.println("Reciver is default");
	else
	    System.out.println("Reciver is not default");
	System.out.println("Receiver is " + synthReceiver.toString());
	//synthReceiver = MidiSystem.getReceiver();
	System.out.println("Receiver is now " + synthReceiver.toString());
	ShortMessage volMessage = new ShortMessage();
	int midiVolume = 1;
	for (Receiver rec: synthesizer.getReceivers()) {
	    System.out.println("Setting vol on recveiver " + rec.toString());
	for (int i = 0; i < channels.length; i++) {
	    try {
		// volMessage.setMessage(ShortMessage.CONTROL_CHANGE, i, 123, midiVolume);
		volMessage.setMessage(ShortMessage.CONTROL_CHANGE, i, 7, midiVolume);
	    } catch (InvalidMidiDataException e) {
		e.printStackTrace();
}
	    synthReceiver.send(volMessage, -1);
	    rec.send(volMessage, -1);
	}
	}
	System.out.println("Changed midi volume");
	*/
	/* master volume control using sysex */
	/* http://www.blitter.com/~russtopia/MIDI/~jglatt/tech/midispec/mastrvol.htm */
	/*
	SysexMessage sysexMessage = new SysexMessage();
	/* volume values from http://www.bandtrax.com.au/sysex.htm */
	/* default volume 0x7F * 128 + 0x7F from */
	/*
	byte[] data = {(byte) 0xF0, (byte) 0x7F, (byte) 0x7F, (byte) 0x04, 
		       (byte) 0x01, (byte) 0x0, (byte) 0x7F, (byte) 0xF7};
	sysexMessage.setMessage(data, data.length);
	synthReceiver.send(sysexMessage, -1);
	for (Receiver rec: synthesizer.getReceivers()) {
	    System.out.println("Setting vol on recveiver " + rec.toString());
	    rec.send(sysexMessage, -1);
	}
	*/
     }


    public DisplayReceiver getReceiver() {
	return receiver;
    }

    private static void out(String strMessage)
    {
	System.out.println(strMessage);
    }
}
      
    

DisplayReceiver

The DisplayReceiver collects both ShortMessages as a Receiver and MetaMessages as a MetaEventListener. These are needed to see both the notes and the lyrics.

The DisplayReceiver decodes the notes and text sent to it and passes them to a MidiGUI for display. The class is:

      
      /**
 * DisplayReceiver
 *
 * Acts as a Midi receiver to the default Java Midi sequencer.
 * It collects Midi events and Midi meta messages from the sequencer.
 * these are handed to a UI object for display.
 *
 * The current UI object is a MidiGUI but could be replaced.
 */

import javax.sound.midi.*;
import javax.swing.SwingUtilities;

public class DisplayReceiver implements Receiver, 
					MetaEventListener {
    private MidiGUI gui;
    private Sequencer sequencer;
    private int melodyChannel = SequenceInformation.getMelodyChannel();

    public DisplayReceiver(Sequencer sequencer) {
	this.sequencer = sequencer;
	gui = new MidiGUI(sequencer);
    }

    public void close() {
    }

    /**
     * Called by a Transmitter to receive events
     * as a Receiver
     */
    public void send(MidiMessage msg, long timeStamp) {
	// Note on/off messages come from the midi player
	// but not meta messages

	if (msg instanceof ShortMessage) {
	    ShortMessage smsg = (ShortMessage) msg;

		
	    String strMessage = "Channel " + smsg.getChannel() + " ";
		
	    switch (smsg.getCommand())
		{
		case Constants.MIDI_NOTE_OFF:
		    strMessage += "note Off " + 
			getKeyName(smsg.getData1()) + " " + timeStamp;
		    break;
			
		case Constants.MIDI_NOTE_ON:
		    strMessage += "note On " + 
			getKeyName(smsg.getData1()) + " " + timeStamp;
		    break;
		}
	    Debug.println(strMessage);
	    if (smsg.getChannel() == melodyChannel) {
		gui.setNote(timeStamp, smsg.getCommand(), smsg.getData1());
	    }
		
	}
    }
					    

    public void meta(MetaMessage msg) {
	Debug.println("Reciever got a meta message");
	if (((MetaMessage) msg).getType() == Constants.MIDI_TEXT_TYPE) {
	    setLyric((MetaMessage) msg);
	} else if (((MetaMessage) msg).getType() == Constants.MIDI_END_OF_TRACK)  {
	    System.exit(0);
	}
    }

    public void setLyric(MetaMessage message) {
	byte[] data = message.getData();
	String str = new String(data);
	Debug.println("Lyric +\"" + str + "\" at " + sequencer.getTickPosition());
	gui.setLyric(str);

    }

    private static String[] keyNames = {"C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"};

    public static String getKeyName(int keyNumber) {
        if (keyNumber > 127) {
	    return "illegal value";
	} else {
	    int note = keyNumber % 12;
	    int octave = keyNumber / 12;
	    return keyNames[note] + (octave - 1);
	}
    }

}

      
    

MidiGUI

The MidiGUI is driven by two methods: setLyric() and setNote(). The GUI consists of three main areas: an area giving a "piano" view of the melody as it is played (the PianoPanel), an area showing the complete set of melody notes (the MelodyPanel) and a set of panels showing the lyrics. setNote() is fairly straightforward: it just calls drawNoteOn() or drawNoteOff() in the PianoPanel. setLyric() is considerably more complex.

Most Karaoke players show a couple of lines of text for the lyrics. As lyrics are played, typically the text will change colour to match. When the end of a line is reached, focus will switch to the next line, and the previous line will be replaced with another line of lyrics.

Each line panel must hold a line of lyrics and be able to react to lyrics as they are played. This is handled by an AttributedLyricPanel, shown later. The main task is to feed changes in lyrics through to the selected panel so that it can display them in the correct colours.

The other principal task for the MidiGUI is to switch focus between the AttributedLyricPanels when end of line is detected, and to update the next line of text. The new lines of text can't come from the lyrics as they are played, but must instead be constructed from the sequence containing all of the notes and lyrics. The convenience class SequenceInformation (shown later) takes a Sequence object and has a method to extract an array of LyricLine objects. Each panel displaying a line is given a line from this array.

      
      

import javax.swing.*;
import java.awt.*;
import java.awt.event.*;
import javax.sound.midi.*;
import java.util.Vector;
import java.util.Map;
import java.io.*;


public class MidiGUI extends JFrame {
    //private GridLayout mgr = new GridLayout(3,1);
    private BorderLayout mgr = new BorderLayout();

    private PianoPanel pianoPanel;
    private MelodyPanel melodyPanel;

    private AttributedLyricPanel lyric1;
    private AttributedLyricPanel lyric2;
    private AttributedLyricPanel[] lyricLinePanels;
    private int whichLyricPanel = 0;

    private JPanel lyricsPanel = new JPanel();

    private Sequencer sequencer;
    private Sequence sequence;
    private Vector<LyricLine> lyricLines;

    private int lyricLine = -1;

    private boolean inLyricHeader = true;
    private Vector<DurationNote> melodyNotes;

    private Map<Character, String> pinyinMap;

    private int language;

    public MidiGUI(final Sequencer sequencer) {
	this.sequencer = sequencer;
	sequence = sequencer.getSequence();

	// get lyrics and notes from Sequence Info
	lyricLines = SequenceInformation.getLyrics();
	melodyNotes = SequenceInformation.getMelodyNotes();
	language = SequenceInformation.getLanguage();

	pianoPanel = new PianoPanel(sequencer);
	melodyPanel = new MelodyPanel(sequencer);

	pinyinMap = CharsetEncoding.loadPinyinMap();
	lyric1 = new AttributedLyricPanel(pinyinMap);
	lyric2 = new AttributedLyricPanel(pinyinMap);
	lyricLinePanels = new AttributedLyricPanel[] {
	    lyric1, lyric2};

	Debug.println("Lyrics ");

	for (LyricLine line: lyricLines) {
	    Debug.println(line.line + " " + line.startTick + " " + line.endTick +
			  " num notes " + line.notes.size());
	}

	getContentPane().setLayout(mgr);
	/*
	getContentPane().add(pianoPanel);
	getContentPane().add(melodyPanel);

	getContentPane().add(lyricsPanel);
	*/
	getContentPane().add(pianoPanel, BorderLayout.PAGE_START);
	getContentPane().add(melodyPanel,  BorderLayout.CENTER);

	getContentPane().add(lyricsPanel,  BorderLayout.PAGE_END);


	lyricsPanel.setLayout(new GridLayout(2, 1));
	lyricsPanel.add(lyric1);
	lyricsPanel.add(lyric2);
	setLanguage(language);

	setText(lyricLinePanels[whichLyricPanel], lyricLines.elementAt(0).line);

	Debug.println("First lyric line: " + lyricLines.elementAt(0).line);
	if (lyricLine < lyricLines.size() - 1) {
	    setText(lyricLinePanels[(whichLyricPanel+1) % 2], lyricLines.elementAt(1).line);
	    Debug.println("Second lyric line: " + lyricLines.elementAt(1).line);
	}

	// handle window closing
	setDefaultCloseOperation(JFrame.DO_NOTHING_ON_CLOSE);
	addWindowListener(new WindowAdapter() {
		public void windowClosing(WindowEvent e) {
		    sequencer.stop();
		    System.exit(0);
                }
            });

	// handle resize events
	addComponentListener(new ComponentAdapter() {
		public void componentResized(ComponentEvent e) {
		    Debug.printf("Component has resized to width %d, height %d\n",
				      getWidth(), getHeight());
		    // force resize of children - especially the middle MelodyPanel
		    e.getComponent().validate();        
		}
		public void componentShown(ComponentEvent e) {
		    Debug.printf("Component is visible with width %d, height %d\n",
				      getWidth(), getHeight());          
		}
	    });

	setSize(1600, 900);
	setVisible(true);
    }

    public void setLanguage(int lang) {
	lyric1.setLanguage(lang);
	lyric2.setLanguage(lang);
    }


    /**
     * A lyric starts with a header section
     * We have to skip over that, but can pick useful
     * data out of it
     */

    /**
     * header format is
     *   @Llanguage code
     *   @Ttitle
     *   @Tsinger
     */
 
    public void setLyric(String txt) {
	Debug.println("Setting lyric to " + txt);
	if (inLyricHeader) {
	    if (txt.startsWith("@")) {
		Debug.println("Header: " + txt);
		return;
	    } else {
		inLyricHeader = false;
	    }
	}
	
	if ((lyricLine == -1) && (txt.charAt(0) == '\\')) {
	    lyricLine = 0;
	    colourLyric(lyricLinePanels[whichLyricPanel], txt.substring(1));
	    // lyricLinePanels[whichLyricPanel].colourLyric(txt.substring(1));
	    return;
	}
	
	if (txt.equals("\r\n") || (txt.charAt(0) == '/') || (txt.charAt(0) == '\\')) {
	    if (lyricLine < lyricLines.size() -1)
		Debug.println("Setting next lyric line to \"" + 
			      lyricLines.elementAt(lyricLine + 1).line + "\"");

	    final int thisPanel = whichLyricPanel;
	    whichLyricPanel = (whichLyricPanel + 1) % 2;
	    
	    Debug.println("Setting new lyric line at tick " + 
			  sequencer.getTickPosition());
	    
	    lyricLine++;

	    // if it's a \ or /, the rest of the txt should be the next word to
	    // be coloured

	    if ((txt.charAt(0) == '/') || (txt.charAt(0) == '\\')) {
		Debug.println("Colouring newline of " + txt);
		colourLyric(lyricLinePanels[whichLyricPanel], txt.substring(1));
	    }

	    // Update the current line of text to show the one after next
	    // But delay the update until 0.25 seconds after the next line
	    // starts playing, to preserve visual continuity
	    if (lyricLine + 1 < lyricLines.size()) {
		/*
		  long startNextLineTick = lyricLines.elementAt(lyricLine).startTick;
		  long delayForTicks = startNextLineTick - sequencer.getTickPosition();
		  Debug.println("Next  current "  + startNextLineTick + " " + sequencer.getTickPosition());
		  float microSecsPerQNote = sequencer.getTempoInMPQ();
		  float delayInMicroSecs = microSecsPerQNote * delayForTicks / 24 + 250000L;
		*/

		final Vector<DurationNote> notes = lyricLines.elementAt(lyricLine).notes;

		final int nextLineForPanel = lyricLine + 1;

		if (lyricLines.size() >= nextLineForPanel) {
		    Timer timer = new Timer((int) 1000,
					    new ActionListener() {
						public void actionPerformed(ActionEvent e) {
						    if (nextLineForPanel >= lyricLines.size()) {
							return;
						    }
						    setText(lyricLinePanels[thisPanel], lyricLines.elementAt(nextLineForPanel).line);
						    //lyricLinePanels[thisPanel].setText(lyricLines.elementAt(nextLineForPanel).line);
						
						}
					    });
		    timer.setRepeats(false);
		    timer.start();
		} else {
		    // no more lines
		}		
	    }
	} else {
	    Debug.println("Playing lyric " + txt);
	    colourLyric(lyricLinePanels[whichLyricPanel], txt);
	    //lyricLinePanels[whichLyricPanel].colourLyric(txt);
	}
    }

    /**
     * colour the lyric of a panel.
     * called by one thread, makes changes in GUI thread
     */
    private void colourLyric(final AttributedLyricPanel p, final String txt) {
	SwingUtilities.invokeLater(new Runnable() {
		public void run() {
		    Debug.print("Colouring lyric \"" + txt + "\"");
		    if (p == lyric1) Debug.println(" on panel 1");
		    else Debug.println(" on panel 2");
		    p.colourLyric(txt);
		}
	    }
	    );
    }

    /**
     * set the lyric of a panel.
     * called by one thread, makes changes in GUI thread
     */
    private void setText(final AttributedLyricPanel p, final String txt) {
	SwingUtilities.invokeLater(new Runnable() {
		public void run() {
		    Debug.println("Setting text \"" + txt + "\"");
		    if (p == lyric1) Debug.println(" on panel 1");
		    else Debug.println(" on panel 2");
		    p.setText(txt);
		}
	    }
	    );
    }

    public void setNote(long timeStamp, int onOff, int note) {
	Debug.printf("Setting note in gui to %d\n", note);

	if (onOff == Constants.MIDI_NOTE_OFF) {
	    pianoPanel.drawNoteOff(note);
	} else if (onOff == Constants.MIDI_NOTE_ON) {
	    pianoPanel.drawNoteOn(note);
	}
    }
}



      
    

AttributedLyricPanel

The panel to display a line of lyrics must be able to show text in two colours: the lyrics already sung and the lyrics yet to be sung. The Java AttributedString class is useful for this, as runs of text can be marked with different attributes such as colour. This is wrapped in the AttributedLyricPanel, shown below.

One minor wrinkle concerns language. Chinese has both a character form and a Romanised form called PinYin. Chinese speakers can read the character form; people like me can only understand the PinYin form. So if the language is Chinese, the AttributedLyricPanel also shows the PinYin alongside the Chinese characters. The language identity must therefore be passed to the AttributedLyricPanel as well.
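
As a small self-contained illustration of the AttributedString idea (hypothetical demo code, not part of the player), the first few characters of a lyric line can be given a red foreground while the rest stays in the default colour:

import java.awt.*;
import java.awt.font.TextAttribute;
import java.text.AttributedString;
import javax.swing.*;

// Sketch: draw a lyric line with the first 'coloured' characters in red
public class LyricDemo extends JPanel {
    private final String text = "la la la la";
    private final int coloured = 5;   // characters already sung

    @Override
    public void paintComponent(Graphics g) {
        super.paintComponent(g);
        AttributedString attr = new AttributedString(text);
        attr.addAttribute(TextAttribute.FONT,
                          new Font("Serif", Font.PLAIN, 36), 0, text.length());
        attr.addAttribute(TextAttribute.FOREGROUND, Color.RED, 0, coloured);
        g.drawString(attr.getIterator(), 20, 50);
    }

    public static void main(String[] args) {
        JFrame frame = new JFrame("AttributedString demo");
        frame.add(new LyricDemo());
        frame.setSize(400, 120);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }
}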

The AttributedLyricPanel is

      
      


import javax.swing.*;
import java.awt.*;
import java.awt.font.*;
import java.text.*;
import java.util.Map;


public class AttributedLyricPanel extends JPanel {

    private final int PINYIN_Y = 40;
    private final int TEXT_Y = 90;

    private String text;
    private AttributedString attrText;
    private int coloured = 0;
    private Font font = new Font(Constants.CHINESE_FONT, Font.PLAIN, 36);
    private Font smallFont = new Font(Constants.CHINESE_FONT, Font.PLAIN, 24);
    private Color red = Color.RED;
    private int language;
    private String pinyinTxt = null;

    private Map<Character, String> pinyinMap = null;

    public AttributedLyricPanel(Map<Character, String> pinyinMap) {
	this.pinyinMap = pinyinMap;
    }

    public Dimension getPreferredSize() {
	return new Dimension(1000, TEXT_Y + 20);
    }

    public void setLanguage(int lang) {
	language = lang;
	Debug.printf("Lang in panel is %X\n", lang);
    }
	
    public boolean isChinese() {
	switch (language) {
	case SongInformation.CHINESE1:
	case SongInformation.CHINESE2:
	case SongInformation.CHINESE8:
	case SongInformation.CHINESE131:
	case SongInformation.TAIWANESE3:
	case SongInformation.TAIWANESE7:
	case SongInformation.CANTONESE:
	    return true;
	}
	return false;
    }

    public void setText(String txt) {
	coloured = 0;
	text = txt;
	Debug.println("set text " + text);
	attrText = new AttributedString(text);
	if (text.length() == 0) {
	    return;
	}
	attrText.addAttribute(TextAttribute.FONT, font, 0, text.length());

	if (isChinese()) {
	    pinyinTxt = "";
	    for (int n = 0; n < txt.length(); n++) {
		char ch = txt.charAt(n);
		String pinyin = pinyinMap.get(ch);
		if (pinyin != null) {
		    pinyinTxt += pinyin + " ";
		} else {
		    Debug.printf("No pinyin map for character \"%c\"\n", ch);
		}
	    }
		
	}

	repaint();
    }
	
    public void colourLyric(String txt) {
	coloured += txt.length();
	if (coloured != 0) {
	    repaint();
	}
    }
	
    /**
     * Draw the string with the first part in red, rest in green.
     * String is centred
     */
	 
    @Override
    public void paintComponent(Graphics g) {
	if ((text.length() == 0) || (coloured > text.length())) {
	    return;
	}
	g.setFont(font);
	FontMetrics metrics = g.getFontMetrics();
	int strWidth = metrics.stringWidth(text);
	int panelWidth = getWidth();
	int offset = (panelWidth - strWidth) / 2;

	if (coloured != 0) {
	    try {
		attrText.addAttribute(TextAttribute.FOREGROUND, red, 0, coloured);
	    } catch(Exception e) {
		System.out.println(attrText.toString() + " " + e.toString());
	    }
	}
	g.clearRect(0, 0, getWidth(), getHeight());
	try {
	    g.drawString(attrText.getIterator(), offset, TEXT_Y); 
	} catch (Exception e) {
	    System.err.println("Attr Str exception on " + text);
	}
	// Draw the Pinyin if it's not zero
	if (pinyinTxt != null && pinyinTxt.length() != 0) {
	    g.setFont(smallFont);
	    metrics = g.getFontMetrics();
	    strWidth = metrics.stringWidth(pinyinTxt);
	    offset = (panelWidth - strWidth) / 2;

	    g.drawString(pinyinTxt, offset, PINYIN_Y);
	    g.setFont(font);
	}
    }
}
      
    

PianoPanel

The PianoPanel shows a piano-like keyboard. As a note is turned on, it colours that note blue and returns any previously playing note to normal. When a note is turned off, the note reverts to its normal colour (black or white).

Notes are coloured as setNote() passes note on/note off messages from the sequencer, calling drawNoteOn() or drawNoteOff().

The PianoPanel is

      
      
import java.util.Vector;
import javax.swing.*;
import java.awt.*;
import javax.sound.midi.*;


public class PianoPanel extends JPanel {

    private final int HEIGHT = 100;
    private final int HEIGHT_OFFSET = 10;

    long timeStamp;
    private Vector<DurationNote> notes;
    private Vector<DurationNote> sungNotes;
    private int lastNoteDrawn = -1;
    private Sequencer sequencer;
    private Sequence sequence;
    private int maxNote;
    private int minNote;

    private Vector<DurationNote> unresolvedNotes = new Vector<DurationNote> ();

    private int playingNote = -1;

    public PianoPanel(Sequencer sequencer) {

	maxNote = SequenceInformation.getMaxMelodyNote();
	minNote = SequenceInformation.getMinMelodyNote();
	Debug.println("Max: " + maxNote + " Min " + minNote);
    }

    public Dimension getPreferredSize() {
	return new Dimension(1000, 120);
    }

    public void drawNoteOff(int note) {
	if (note < minNote || note > maxNote) {
	    return;
	}

	Debug.println("Note off played is " + note);
	if (note != playingNote) {
	    // Sometimes "note off" followed immediately by "note on"
	    // gets mixed up to "note on" followed by "note off".
	    // Ignore the "note off" since the next note has already
	    // been processed
	    Debug.println("Ignoring note off");
	    return;
	}
	playingNote = -1;
	repaint();
    }

    public void drawNoteOn(int note) {
	if (note < minNote || note > maxNote) {
	    return;
	}

	Debug.println("Note on played is " + note);
	playingNote = note;
	repaint();

    }



    private void drawPiano(Graphics g, int width, int height) {
	int noteWidth = width / (Constants.MIDI_NOTE_C8 - Constants.MIDI_NOTE_A0);
	for (int noteNum =  Constants.MIDI_NOTE_A0; // A0
	     noteNum <=  Constants.MIDI_NOTE_C8; // C8
	     noteNum++) {
	    
	    drawNote(g, noteNum, noteWidth);
	}
    }

    private void drawNote(Graphics g, int noteNum, int width) {
	if (isWhite(noteNum)) {
	    noteNum -= Constants.MIDI_NOTE_A0;
	    g.setColor(Color.WHITE);
	    g.fillRect(noteNum*width, HEIGHT_OFFSET, width, HEIGHT);
	    g.setColor(Color.BLACK);
	    g.drawRect(noteNum*width, HEIGHT_OFFSET, width, HEIGHT);
	} else {
	    noteNum -= Constants.MIDI_NOTE_A0;
	    g.setColor(Color.BLACK);
	    g.fillRect(noteNum*width, HEIGHT_OFFSET, width, HEIGHT);
	}
	if (playingNote != -1) {
	    g.setColor(Color.BLUE);
	    g.fillRect((playingNote - Constants.MIDI_NOTE_A0) * width, HEIGHT_OFFSET, width, HEIGHT);
	}	    
    }

    private boolean isWhite(int noteNum) {
	noteNum = noteNum % 12;
	switch (noteNum) {
	case 1:
	case 3:
	case 6:
	case 8:
	case 10:
	    return false;
	default:
	    return true;
	}
    }

    @Override
    public void paintComponent(Graphics g) {

	int ht = getHeight();
	int width = getWidth();

	drawPiano(g, width, ht);

    }
}
      
    

MelodyPanel

The MelodyPanel is a scrolling panel showing all the notes of the melody, with the currently playing note centred in the display. This is done by drawing all of the notes into a BufferedImage and then copying the relevant part across to the screen every few milliseconds (SLEEP_MSECS in the code).

The MelodyPanel is

      
      
import java.util.Vector;
import javax.swing.*;
import java.awt.*;
import java.awt.event.*;
import javax.sound.midi.*;
import java.awt.image.BufferedImage;
import java.io.*;
import javax.imageio.*;

public class MelodyPanel extends JPanel {

    private static int DBL_BUF_SCALE = 2;
    private static final int NOTE_HEIGHT = 10;
    private static final int SLEEP_MSECS = 5;

    private long timeStamp;
    private Vector<DurationNote> notes;
    private Sequencer sequencer;
    private Sequence sequence;
    private int maxNote;
    private int minNote;
    private long tickLength = -1;
    private long currentTick = -1;
    private Image image = null;

    /**
     * The panel where the melody notes are shown in a
     * scrolling panel 
     */
    public MelodyPanel(Sequencer sequencer) {

	maxNote = SequenceInformation.getMaxMelodyNote();
	minNote = SequenceInformation.getMinMelodyNote();
	Debug.println("Max: " + maxNote + " Min " + minNote);
	notes = SequenceInformation.getMelodyNotes();
	this.sequencer = sequencer;
	tickLength = sequencer.getTickLength() + 1000; // hack to make white space at end, plus fix bug

	//new TickPointer().start();
	// handle resize events
	addComponentListener(new ComponentAdapter() {
		public void componentResized(ComponentEvent e) {
		    Debug.printf("Component melody panel has resized to width %d, height %d\n",
				      getWidth(), getHeight());          
		}
		public void componentShown(ComponentEvent e) {
		    Debug.printf("Component malody panel is visible with width %d, height %d\n",
				      getWidth(), getHeight());          
		}
	    });

    }

    /**
     * Redraw the melody image after each tick
     * to give a scrolling effect
     */
    private class TickPointer extends Thread {
	public void run() {
	    while (true) {
		currentTick = sequencer.getTickPosition();
		MelodyPanel.this.repaint();
		/*
		SwingUtilities.invokeLater(
					    new Runnable() {
						public void run() {
						    synchronized(MelodyPanel.this) {
						    MelodyPanel.this.repaint();
						    }
						}
					    });
		*/
		try {
		    sleep(SLEEP_MSECS);
		} catch (Exception e) {
		    // keep going
		    e.printStackTrace();
		}
	    }
	}

    }

    /**
     * Draw the melody into a buffer so we can just copy bits to the screen
     */
    private void drawMelody(Graphics g, int front, int width, int height) {
	try {
	g.setColor(Color.WHITE);
	g.fillRect(0, 0, width, height);
	g.setColor(Color.BLACK);

	String title = SequenceInformation.getTitle();
	if (title != null) {
	    //Font f = new Font("SanSerif", Font.ITALIC, 40);
	    Font f = new Font(Constants.CHINESE_FONT, Font.ITALIC, 40);
	    g.setFont(f);
	    int strWidth = g.getFontMetrics().stringWidth(title);
	    g.drawString(title, (front - strWidth/2), height/2);
	    Debug.println("Drawn title " + title);
	}

	for (DurationNote note: notes) {
	    long startNote = note.startTick;
	    long endNote = note.endTick;
	    int value = note.note;

	    int ht = (value - minNote) * (height - NOTE_HEIGHT) / (maxNote - minNote) + NOTE_HEIGHT/2;
	    // it's upside down
	    ht = height - ht;

	    long start = front + (int) (startNote * DBL_BUF_SCALE);
	    long end = front + (int) (endNote * DBL_BUF_SCALE);

	    drawNote(g, ht, start, end);
	    //g.drawString(title, (int)start, (int)height/2);	
	}
	} catch(Exception e) {
	    System.err.println("Drawing melody error " + e.toString());
	}
    }

    /**
     * Draw a horizontal bar to represent a note
     */
    private void drawNote(Graphics g, int height, long start, long end) {
	Debug.printf("Drawing melody at start %d end %d height %d\n", start, end,  height - NOTE_HEIGHT/2);

	g.fillRect((int) start, height - NOTE_HEIGHT/2, (int) (end-start), NOTE_HEIGHT);
    }

    /**
     * Draw a vertical line in the middle of the screen to
     * represent where we are in the playing notes
     */
    private void paintTick(Graphics g, long width, long height) {
	long x = (currentTick * width) / tickLength;
	g.drawLine((int) width/2, 0, (int) width/2, (int) height);
	//System.err.println("Painted tcik");
    }

    // leave space at the front of the image to draw title, etc
    int front = 1000;

    /**
     * First time, draw the melody notes into an off-screen buffer
     * After that, copy a segment of the buffer into the image,
     * with the centre of the image the current note
     */
    @Override
    public void paintComponent(Graphics g) {
	int ht = getHeight();
	int width = getWidth();
	//int front = width / 2;

	synchronized(this) {
	if (image == null) {
	    /*
	     * We want to stretch out the notes so that they appear nice and wide on the screen.
	     * A DBL_BUF_SCALE of 2 does this okay. But then tickLength * DBL_BUF_SCALE may end
	     * up larger than an int, and we can't make a BufferedImage wider than MAXINT.
	     * So we may have to adjust DBL_BUF_SCALE.
	     *
	     * Yes, I know we ask Java to rescale images on the fly, but that costs in runtime.
	     */
	    
	    Debug.println("tick*DBLBUFSCALE " + tickLength * DBL_BUF_SCALE);
	    
	    if ((long) (tickLength * DBL_BUF_SCALE) > (long) Short.MAX_VALUE) {
		// DBL_BUF_SCALE = ((float)  Integer.MAX_VALUE) / ((float) tickLength);
		DBL_BUF_SCALE = 1;
		Debug.println("Adjusted DBL_BUF_SCALE to "+ DBL_BUF_SCALE);
	    }
	    
	    Debug.println("DBL_BUF_SCALE is "+ DBL_BUF_SCALE);

	    // draw melody into a buffered image
	    Debug.printf("New buffered img width %d ht %d\n", tickLength, ht);
	    image = new BufferedImage(front + (int) (tickLength * DBL_BUF_SCALE), ht, BufferedImage.TYPE_INT_RGB);
	    Graphics ig = image.getGraphics();
	    drawMelody(ig, front, (int) (tickLength * DBL_BUF_SCALE), ht);
	    new TickPointer().start();


	    try {
		File outputfile = new File("saved.png");
		ImageIO.write((BufferedImage) image, "png", outputfile);
	    } catch (Exception e) {
		System.err.println("Error in image write " + e.toString());
	    }

	}
	//System.err.printf("Drawing img from %d ht %d width %d\n", 
	//		  front + (int) (currentTick * DBL_BUF_SCALE - width/2), ht, width);
	
	boolean b = g.drawImage(image, 0, 0, width, ht, 
				front + (int) (currentTick * DBL_BUF_SCALE - width/2), 0, 
				front + (int) (currentTick * DBL_BUF_SCALE + width/2), ht,
		    null);
	/*System.out.printf("Ht of BI %d, width %d\n", ((BufferedImage)image).getHeight(), 
			  ((BufferedImage) image).getWidth());
	*/
	
	//if (b) System.err.println("Drawn ok"); else System.err.println("NOt drawn ok");
	paintTick(g, width, ht);
	}
    }

}

      
    

SequenceInformation

The SequenceInformation class holds the current Sequence as a static object and extracts information from it: the melody channel, the lyric lines, the melody notes, the title, the performer and the language. It is given by

      
      

import javax.sound.midi.*;
import java.util.Vector;

public class SequenceInformation {

    private static Sequence sequence = null;
    private static Vector<LyricLine> lyricLines = null;
    private static Vector<DurationNote> melodyNotes = null;
    private static int lang = -1;
    private static String title = null;
    private static String performer = null;
    private static int maxNote;
    private static int minNote;

    private static int melodyChannel = -1;// no such value // default for Songken

    public static void setSequence(Sequence seq) {
	if (sequence != seq) {
	    sequence = seq;
	    lyricLines = null;
	    melodyNotes = null;
	    lang = -1;
	    title = null;
	    performer = null;
	    maxNote = -1;
	    minNote = Integer.MAX_VALUE;
	}
    }

    public static long getTickLength() {
	return sequence.getTickLength();
    }

    public static int getMelodyChannel() {
	boolean firstNoteSeen[] = {false, false, false, false, false, false, false, false,
				   false, false, false, false, false, false, false, false};
	boolean possibleChannel[] = {false, false, false, false, false, false, false, false,
				   false, false, false, false, false, false, false, false};
	if (melodyChannel != -1) {
	    return melodyChannel;
	}

	if (lyricLines == null) {
	    lyricLines = getLyrics();
	}

	long startLyricTick = ((LyricLine) lyricLines.get(0)).startTick;
	Debug.printf("Lyrics start at %d\n", startLyricTick);

	Track[]	tracks = sequence.getTracks();
	for (int nTrack = 0; nTrack < tracks.length; nTrack++) {
	    // out("Track " + nTrack + ":");
	    // out("-----------------------");
	    Track track = tracks[nTrack];
	    for (int nEvent = 0; nEvent < track.size(); nEvent++) {
		MidiEvent evt = track.get(nEvent);
		MidiMessage msg = evt.getMessage();
		if (msg instanceof ShortMessage) {
		    ShortMessage smsg= (ShortMessage) msg;
		    int channel = smsg.getChannel();
		    if (firstNoteSeen[channel]) {
			continue;
		    }
		    if (smsg.getCommand() == Constants.MIDI_NOTE_ON) {
			long tick = evt.getTick();
			Debug.printf("First note on for channel %d at tick %d\n",
					  channel, tick);
			firstNoteSeen[channel] = true;
			if (Math.abs(startLyricTick - tick) < 10) {
			    // close enough - we hope!
			    melodyChannel = channel;
			    possibleChannel[channel] = true;
			    Debug.printf("Possible melody channel is %d\n", channel);
			}
			if (tick > startLyricTick + 11) {
			    break;
			}
		    }
		}			  
					 


		//output(event);
	    }
	    // out("---------------------------------------------------------------------------");
	}

	return melodyChannel;
    }

    public static int getLanguage() {
	return lang;
    }

    public static String getTitle() {
	return title;
    }

    public static String getPerformer() {
	return performer;
    }

    public static void dumpLyrics() {
	for (LyricLine lyric: lyricLines) {
	    System.out.println(lyric.line);
	}
    }


    /**
     * Build a vector of lyric lines
     * Each line has a start and an end tick
     * and a string for the lyrics in that line
     */
    public static Vector<LyricLine> getLyrics() {
	if (lyricLines != null) {
	    return lyricLines;
	}

	lyricLines = new Vector<LyricLine> ();
	LyricLine nextLyricLine = new LyricLine();
	StringBuffer buff = new StringBuffer();
	long ticks = 0L;

	Track[] tracks = sequence.getTracks();
	for (int nTrack = 0; nTrack < tracks.length; nTrack++) {
	    for (int n = 0; n < tracks[nTrack].size(); n++) {
		MidiEvent evt = tracks[nTrack].get(n);
		MidiMessage msg = evt.getMessage();
		ticks = evt.getTick();

		if (msg instanceof MetaMessage) {
		    Debug.println("Got a meta mesg in seq");
		    if (((MetaMessage) msg).getType() == Constants.MIDI_TEXT_TYPE) {
			MetaMessage message = (MetaMessage) msg;

			byte[] data = message.getData();
			String str = new String(data);
			Debug.println("Got a text mesg in seq \"" + str + "\" " + ticks);

			if (ticks == 0) {
			    if (str.startsWith("@L")) {
				lang = decodeLang(str.substring(2));
			    } else if (str.startsWith("@T")) {
				if (title == null) {
				    title = str.substring(2);
				} else {
				    performer = str.substring(2);
				}
			    }
				
			}
			if (ticks > 0) {
			    //if (str.equals("\r") || str.equals("\n")) {
			    if ((data[0] == '/') || (data[0] == '\\')) {
				if (buff.length() == 0) {
				    // blank line -  maybe at start of song
				    // fix start time from NO_TICK
				    nextLyricLine.startTick = ticks;
				} else {
				    nextLyricLine.line = buff.toString();
				    nextLyricLine.endTick = ticks;
				    lyricLines.add(nextLyricLine);
				    buff.delete(0, buff.length());
				    
				    nextLyricLine = new LyricLine();
				}
				buff.append(str.substring(1));
			    } else {
				if (nextLyricLine.startTick == Constants.NO_TICK) {
				    nextLyricLine.startTick = ticks;
				}
				buff.append(str);
			    }
			}
		    }
		}
	    }
	    // save last line (but only once)
	    if (buff.length() != 0) {
		nextLyricLine.line = buff.toString();
		nextLyricLine.endTick = ticks;
		lyricLines.add(nextLyricLine);
		buff.delete(0, buff.length());		    
	    }
	}
	if (Debug.DEBUG) {
	    dumpLyrics();
	}
	return lyricLines;
    }

    /**
     * gets a vector of lyric notes
     * side-effect: sets last tick
     */
    public static Vector<DurationNote> getMelodyNotes() {
	if (melodyChannel == -1) {
	    getMelodyChannel();
	}

	if (melodyNotes != null) {
	    return melodyNotes;
	}

	melodyNotes = new Vector<DurationNote> ();
	Vector<DurationNote> unresolvedNotes = new Vector<DurationNote> ();
   
	Track[] tracks = sequence.getTracks();
	for (int nTrack = 0; nTrack < tracks.length; nTrack++) {
	    for (int n = 0; n < tracks[nTrack].size(); n++) {
		MidiEvent evt = tracks[nTrack].get(n);
		MidiMessage msg = evt.getMessage();
		long ticks = evt.getTick();

		if (msg instanceof ShortMessage) {
		    ShortMessage smsg= (ShortMessage) msg;
		    if (smsg.getChannel() == melodyChannel) {
			int note = smsg.getData1();
			if (note < Constants.MIDI_NOTE_A0 || note > Constants.MIDI_NOTE_C8) {
			    continue;
			}

			if (smsg.getCommand() == Constants.MIDI_NOTE_ON) {
			    // note on
			    DurationNote dnote = new DurationNote(ticks, note);
			    melodyNotes.add(dnote);
			    unresolvedNotes.add(dnote);

			} else if (smsg.getCommand() == Constants.MIDI_NOTE_OFF) {
			    // note off
			    for (int m = 0; m < unresolvedNotes.size(); m++) {
				DurationNote dnote = unresolvedNotes.elementAt(m);
				if (dnote.note == note) {
				    dnote.duration = ticks - dnote.startTick;
				    dnote.endTick = ticks;
				    unresolvedNotes.remove(m);
				}
			    }
				    
			}

		    }
		}
	    }
	}
	return melodyNotes;
    }

    public static int getMaxMelodyNote() {
	if (maxNote == -1) {
	    getMaxMin();
	}
	return maxNote;
    }

    public static int getMinMelodyNote() {
	if (minNote == Integer.MAX_VALUE) {
	    getMaxMin();
	}
	return minNote;
    }

    private static void getMaxMin() {
	StringBuffer buff = new StringBuffer();

	Track[] tracks = sequence.getTracks();
	for (int nTrack = 0; nTrack < tracks.length; nTrack++) {
	    for (int n = 0; n < tracks[nTrack].size(); n++) {
		MidiEvent evt = tracks[nTrack].get(n);
		MidiMessage msg = evt.getMessage();
		long ticks = evt.getTick();

		if (msg instanceof ShortMessage) {
		    ShortMessage smsg= (ShortMessage) msg;
		    if (smsg.getChannel() == melodyChannel) {
			if (smsg.getCommand() != Constants.MIDI_NOTE_OFF) {
			    // only consider note off events here
			    continue;
			}
			int note = smsg.getData1();
			if (note < 21 || note > 107) {
			    continue;
			}
			//Debug.println("Note is " + note);
			if (note > maxNote) {
			    Debug.println("Setting max to "+ note);
			    maxNote = note;
			} else if (note < minNote) {
			    minNote = note;
			}
		    }
		}
	    }
	}
    }

    private static int decodeLang(String str) {
	if (str.equals("ENG")) {
	    return SongInformation.ENGLISH;
	}
	if (str.equals("CHI")) {
	    return SongInformation.CHINESE1;
	}
	return -1;
    }
}

      
    

PinYin

For Chinese language files, one of my aims was to display the PinYin (Romanised form) of the Chinese characters. For this, I need to be able to rewrite any sequence of Chinese characters into their PinYin form. I couldn't find a simple list of characters and their corresponding PinYin. The closest is the CC-CEDICT Chinese-English dictionary, which can be downloaded as a text file. Typical lines in this file are

	
不賴 不赖 [bu4 lai4] /not bad/good/fine/
	
      

Each line has the Traditional characters followed by the Simplified characters, the PinYin in [...] and then English meanings.

I used the following shell script to make a list of character/PinYin pairs:

	
#!/bin/bash

# get pairs of character + pinyin by throwing away other stuff in the dictionary

awk '{print $2, $3}' cedict_ts.u8 | grep -v '[A-Z]' | 
  grep -v '^.[^ ]' | sed -e 's/\[//' -e 's/\]//' -e 's/[0-9]$//' | 
    sort | uniq -w 1 > pinyinmap.txt
	
      

to give lines such as

	
好 hao
妁 shuo
如 ru
妃 fei
	
      

This can then be read into a Java Map, and then quick lookups can be done to translate Chinese to PinYin.
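
The player loads this table through CharsetEncoding.loadPinyinMap(), used in the MidiGUI above. That class is not listed here, but a minimal sketch of such a loader (assuming pinyinmap.txt is UTF-8 with one character/PinYin pair per line; the class name PinyinLoader is just for illustration) might be:

import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// Sketch: read "character pinyin" pairs into a lookup map
public class PinyinLoader {
    public static Map<Character, String> load(String fileName) throws IOException {
        Map<Character, String> map = new HashMap<Character, String>();
        BufferedReader in = new BufferedReader(
            new InputStreamReader(new FileInputStream(fileName),
                                  StandardCharsets.UTF_8));
        String line;
        while ((line = in.readLine()) != null) {
            String[] parts = line.trim().split("\\s+", 2);
            if (parts.length == 2 && parts[0].length() == 1) {
                map.put(parts[0].charAt(0), parts[1]);
            }
        }
        in.close();
        return map;
    }

    public static void main(String[] args) throws IOException {
        Map<Character, String> pinyin = load("pinyinmap.txt");
        System.out.println(pinyin.get('好'));   // prints "hao" if the entry exists
    }
}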

Karaoke player with sampling

The Karaoke player described so far is functionally equivalent to kmidi and pykar. It plays KAR files, shows the notes and scrolls through the lyrics. To sing along with it, you need to use an ALSA or PulseAudio player.

But Java can also play sampled sounds, as discussed in an earlier chapter, so that code can be brought into the Karaoke player to give a more complete solution. For MIDI, Java normally provides only the Gervill synthesizer, a software synthesizer that plays out through the PulseAudio default device. The actual output device is not accessible through Java; it is controlled by the underlying PulseAudio configuration. But for sampled media the input devices can be controlled, so in the following code a selection box allows a choice of sampled input device and leaves the output device at the default.

      
     /*
 * KaraokePlayerSampled.java
 *
 */

import javax.swing.*;
import javax.sound.sampled.*;

public class KaraokePlayerSampled {


    public static void main(String[] args) throws Exception {
	if (args.length != 1) {
	    System.err.println("KaraokePlayer: usage: " +
			       "KaraokePlayer <midifile>");
	    System.exit(1);
	}
	String	strFilename = args[0];

	Mixer.Info[] mixerInfo = AudioSystem.getMixerInfo();

	String[] possibleValues = new String[mixerInfo.length];
	for(int cnt = 0; cnt < mixerInfo.length; cnt++){
	    possibleValues[cnt] = mixerInfo[cnt].getName();
	}
	Object selectedValue = JOptionPane.showInputDialog(null, "Choose mixer", "Input",
							   JOptionPane.INFORMATION_MESSAGE, null,
							   possibleValues, possibleValues[0]);

	System.out.println("Mixer string selected " + ((String)selectedValue));


	Mixer mixer = null;
	for(int cnt = 0; cnt < mixerInfo.length; cnt++){
	    if (mixerInfo[cnt].getName().equals((String)selectedValue)) {
		mixer = AudioSystem.getMixer(mixerInfo[cnt]);
		System.out.println("Got a mixer");
		break;
	    }
	}//end for loop


	MidiPlayer midiPlayer = new MidiPlayer();
	midiPlayer.playMidiFile(strFilename);

        SampledPlayer sampledPlayer = new SampledPlayer(/* midiPlayer.getReceiver(), */ mixer);
        sampledPlayer.playAudio();
    }
}




      
    

The code to play the sampled media is pretty much the same as we have seen before:

      
      


import java.io.IOException;

import javax.sound.sampled.Line;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.TargetDataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.Control;

import javax.swing.*;

public class SampledPlayer {

    private DisplayReceiver receiver;
    private Mixer mixer;

    public SampledPlayer(/* DisplayReceiver receiver, */ Mixer mixer) {
	// this.receiver = receiver; // the receiver parameter is commented out above
	this.mixer = mixer;
    }

    //This method creates and returns an
    // AudioFormat object for a given set of format
    // parameters.  If these parameters don't work
    // well for you, try some of the other
    // allowable parameter values, which are shown
    // in comments following the declarations.
    private static AudioFormat getAudioFormat(){
	float sampleRate = 44100.0F;
	//8000,11025,16000,22050,44100
	int sampleSizeInBits = 16;
	//8,16
	int channels = 1;
	//1,2
	boolean signed = true;
	//true,false
	boolean bigEndian = false;
	//true,false
	return new AudioFormat(sampleRate,
			       sampleSizeInBits,
			       channels,
			       signed,
			       bigEndian);
    }//end getAudioFormat

 

    public  void playAudio() throws Exception {
	AudioFormat audioFormat;
	TargetDataLine targetDataLine;
	
	audioFormat = getAudioFormat();
	DataLine.Info dataLineInfo =
	    new DataLine.Info(
			      TargetDataLine.class,
			      audioFormat);
	targetDataLine = (TargetDataLine)
	    AudioSystem.getLine(dataLineInfo);
	
	targetDataLine.open(audioFormat, 
			    audioFormat.getFrameSize() * Constants.FRAMES_PER_BUFFER);
	targetDataLine.start();
	
	/*
	for (Control control: targetDataLine.getControls()) {
	    System.out.println("Target control: " + control.getType());
	}
	*/

	playAudioStream(new AudioInputStream(targetDataLine), mixer);
    } // playAudioFile
     
    /** Plays audio from the given audio input stream. */
    public  void playAudioStream(AudioInputStream audioInputStream, Mixer mixer) {

	new AudioPlayer(audioInputStream, mixer).start();
    } // playAudioStream

    class AudioPlayer extends Thread {
	AudioInputStream audioInputStream;
	SourceDataLine dataLine;
	AudioFormat audioFormat;

	// YIN stuff
	// PitchProcessorWrapper ppw;

	AudioPlayer( AudioInputStream audioInputStream, Mixer mixer) {
	    this.audioInputStream = audioInputStream;

	    // Set to nearly max, like Midi sequencer does
	    Thread curr = Thread.currentThread();
	    Debug.println("Priority on sampled: " + curr.getPriority());
	    int priority = Thread.NORM_PRIORITY
		+ ((Thread.MAX_PRIORITY - Thread.NORM_PRIORITY) * 3) / 4;
	    curr.setPriority(priority);
	    Debug.println("Priority now on sampled: " + curr.getPriority());

	    // Audio format provides information like sample rate, size, channels.
	    audioFormat = audioInputStream.getFormat();
	    Debug.println( "Play input audio format=" + audioFormat );
     
	    // Open a data line to play our type of sampled audio.
	    // Use SourceDataLine for play and TargetDataLine for record.

	    if (mixer == null) {
		System.out.println("can't find a mixer");
	    } else {
		Line.Info[] lines = mixer.getSourceLineInfo();
		if (lines.length >= 1) {
		    try {
			dataLine = (SourceDataLine) AudioSystem.getLine(lines[0]);
			System.out.println("Got a source line for " + mixer.toString());
		    } catch(Exception e) {
		    }
		} else {
		    System.out.println("no source lines for this mixer " + mixer.toString());
		}

		// only list the mixer controls if we actually have a mixer
		for (Control control: mixer.getControls()) {
		    System.out.println("Mixer control: " + control.getType());
		}
	    }

		

	    DataLine.Info info = null;
	    if (dataLine == null) { 
		info = new DataLine.Info( SourceDataLine.class, audioFormat );
		if ( !AudioSystem.isLineSupported( info ) ) {
		    System.out.println( "Play.playAudioStream does not handle this type of audio on this system." );
		    return;
		}
	    }
     
	    try {
		// Create a SourceDataLine for play back (throws LineUnavailableException).
		if (dataLine == null) {	
		    dataLine = (SourceDataLine) AudioSystem.getLine( info );
		}
		Debug.println( "SourceDataLine class=" + dataLine.getClass() );
     
		// The line acquires system resources (throws LineAvailableException).
		dataLine.open( audioFormat,
			       audioFormat.getFrameSize() * Constants.FRAMES_PER_BUFFER);
     
		for (Control control: dataLine.getControls()) {
		    System.out.println("Source control: " + control.getType());
		}
		// Adjust the volume on the output line.
		if( dataLine.isControlSupported( FloatControl.Type.VOLUME) ) {
		    // if( dataLine.isControlSupported( FloatControl.Type.MASTER_GAIN ) ) {
		    //FloatControl volume = (FloatControl) dataLine.getControl( FloatControl.Type.MASTER_GAIN );		    
		    FloatControl volume = (FloatControl) dataLine.getControl( FloatControl.Type.VOLUME);
		    System.out.println("Max vol " + volume.getMaximum());
		    System.out.println("Min vol " + volume.getMinimum());
		    System.out.println("Current vol " + volume.getValue());
		    volume.setValue( 60000.0F );
		    System.out.println("New vol " + volume.getValue());
		} else {
		    System.out.println("Volume control not supported");
		}
		if (dataLine.isControlSupported( FloatControl.Type.REVERB_RETURN)) {
		    System.out.println("reverb return supported");
		} else {
		    System.out.println("reverb return not supported");
		}
		if (dataLine.isControlSupported( FloatControl.Type.REVERB_SEND)) {
		    System.out.println("reverb send supported");
		} else {
		    System.out.println("reverb send not supported");
		}
     

	    } catch ( LineUnavailableException e ) {
		e.printStackTrace();
	    }

	    // ppw = new PitchProcessorWrapper(audioInputStream, receiver);
	}


	public void run() {
     
	    // Allows the line to move data in and out to a port.
	    dataLine.start();
	    
	    // Create a buffer for moving data from the audio stream to the line.
	    int bufferSize = (int) audioFormat.getSampleRate() * audioFormat.getFrameSize();
	    bufferSize =  audioFormat.getFrameSize() * Constants.FRAMES_PER_BUFFER;
	    Debug.println("Buffer size: " + bufferSize);
	    byte [] buffer = new byte[bufferSize];
	    
	    try {
		int bytesRead = 0;
		while ( bytesRead >= 0 ) {
		    bytesRead = audioInputStream.read( buffer, 0, buffer.length );
		    if ( bytesRead >= 0 ) {
			int framesWritten = dataLine.write( buffer, 0, bytesRead );
			// ppw.write(buffer, bytesRead);
		    }
		} // while
	    } catch ( IOException e ) {
		e.printStackTrace();
	    }
     
	    // Continues data line I/O until its buffer is drained.
	    dataLine.drain();
     
	    Debug.println( "Sampled player closing line." );
	    // Closes the data line, freeing any resources such as the audio device.
	    dataLine.close();
	}
    }

    // Turn into a GUI version or pick up from prefs
    public void listMixers() {
	try{
	    Mixer.Info[] mixerInfo = 
		AudioSystem.getMixerInfo();
	    System.out.println("Available mixers:");
	    for(int cnt = 0; cnt < mixerInfo.length;
		cnt++){
		System.out.println(mixerInfo[cnt].
				   getName());
		
		Mixer mixer = AudioSystem.getMixer(mixerInfo[cnt]);
		Line.Info[] sourceLines = mixer.getSourceLineInfo();
		for (Line.Info s: sourceLines) {
		    System.out.println("  Source line: " + s.toString());
		}
		Line.Info[] targetLines = mixer.getTargetLineInfo();
		for (Line.Info t: targetLines) {
		    System.out.println("  Target line: " + t.toString());
		}
		
   
	    }//end for loop
	} catch(Exception e) {
	}
    }
}
      
    

Comments on device choices

If the default devices are chosen, the input and output devices are the PulseAudio default devices; normally these would both be the computer's sound card. However, the defaults can be changed using, for example, the PulseAudio volume control, which can set the input device, the output device or both. The mixer selection dialogue can also be used to set the input device for the sampled media.

This raises a number of possible combinations of input and output devices.

Using different devices raises the problem of "clock drift", where the devices have clocks that are not synchronised. In the worst case on my system, over a three minute song I could hear a noticeable lag in the sampled audio while the KAR file played happily. It also introduced a noticeable latency in playing the sampled audio.

Performance

The program top can give a good idea of how much CPU is used by various processes. My current computer is a high-end Dell laptop with a quad-core Intel i7-2760QM CPU running at 2.4GHz; according to CPU Benchmarks the processor is in the "High End CPU Chart". On this computer, tested with various KAR files, PulseAudio takes about 30% of the CPU and Java about 60%, and on occasions these figures were exceeded. There is not much left for additional functionality!

In addition, while playing a MIDI file the Java process sometimes hangs, resuming with up to 600% CPU usage (I don't know how top manages to record that!). This makes it effectively unusable, and I am not sure where the problem lies.

Downloads

The Java source files are here and the Chinese lookup table is here

Conclusion

JavaSound has no direct support for Karaoke. This chapter has looked at how to combine the JavaSound libraries with other libraries such as Swing to give a Karaoke player for MIDI files. It requires a high-end computer to run these programs.



Copyright © Jan Newmarch, jan@newmarch.name
"Programming and Using Linux Sound - in depth" by Jan Newmarch is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License .
Based on a work at https://jan.newmarch.name/LinuxSound/ .
