Hey all –
First, let me set up the situation: I have some machine-generated
C code that is really nothing more than declarations and definitions
of various byte arrays. I’d like to convert it, via Ruby and SWIG, to
another format. But something odd happens when I try to use
the following *.i file with SWIG:
%module bytearrays
%inline %{
#include "bytearrays.h" /* just contains typedef for Data */
extern Data * red;
extern Data * blue;
%}
typedef unsigned char Data;
%{
static Data data_get(Data * array, unsigned int idx) {
if (!array) return 0xFF; /* TODO: raise an exception instead */
return array[idx];
}
%}
Data data_get(Data * array, unsigned int idx);
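For completeness, here is what bytearrays.h presumably amounts to, given the comment in the %inline block above (just a sketch; my actual header may have guards etc.):

```c
/* bytearrays.h -- per the comment in the .i file, essentially just: */
typedef unsigned char Data;
```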
If I declare the arrays in bytearrays.c like so, I can get irb to
segfault:
#include "bytearrays.h"
static Data red[8] = {
0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07
};
static Data blue[18] = {
0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
0x00, 0x01
};
That is, irb will choke on the following expression:
irb> Bytearrays.data_get(Bytearrays.red, 0)
But everything is fine if I do this:
#include "bytearrays.h"
static Data red_data[8] = {
0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07
};
static Data blue_data[18] = {
0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
0x00, 0x01
};
Data * red = red_data;
Data * blue = blue_data;
What gives? I’m pretty sure I’m missing something obvious; I just
don’t know what. For the record, I’m running this on Mac OS X
10.4.8 (Intel) with Ruby 1.8.5.
Thanks in advance,
– Dan C.