Better platform to turn software into VHDL/Verilog for an FPGA

I am looking at developing on an FPGA, but it would be easier for me to write the code in Python or Scala and have it converted to VHDL or Verilog.

I want to hook many sensors up to a device and, as the data comes in, perform calculations quickly enough to display the results on a video wall. The FPGA would therefore take dozens of sensors as inputs and drive several video controllers for the wall.

This is a library for code written in Scala. For this one I am curious whether writing the code in a mix of Java and Scala would affect what it generates.

http://simplifide.com/drupal6/

This is a Python-to-VHDL converter.

http://www.myhdl.org/doku.php

With both of these I am curious as to the limitations.

I would prefer simplifide, as I am stronger in Scala than in Python, but from some basic looking around it seems that MyHDL may be the more robust platform.
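
For what it's worth, here is roughly what a trivial MyHDL design and its conversion look like, based on my reading so far (a minimal sketch using the classic toVHDL/toVerilog calls; newer MyHDL releases apparently use a @block decorator with a .convert() method instead, and the module and signal names here are made up):

    from myhdl import Signal, intbv, always_comb, toVHDL, toVerilog

    def mux2(sel, a, b, y):
        """2-to-1 multiplexer, written as ordinary Python."""
        @always_comb
        def logic():
            if sel:
                y.next = a
            else:
                y.next = b
        return logic

    # a boolean select and three 8-bit data signals (names are invented)
    sel = Signal(bool(0))
    a, b, y = [Signal(intbv(0)[8:]) for _ in range(3)]

    toVHDL(mux2, sel, a, b, y)     # should write mux2.vhd
    toVerilog(mux2, sel, a, b, y)  # should write mux2.v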

UPDATE:

The reason for the FPGA is that it is very good at doing multiple tasks at the same time. As the data comes in, it would be easy to change the code on the FPGA to adapt to the needs of the users, which depend on the experiment.

So, for example, suppose there are 8 x 3 weather sensors on each floor of an office building: temperature, wind speed, and barometric pressure (8 of each sensor on each floor), plus additional sensors that measure deformation of the walls. A real-time interface that reads all of these at the same time and keeps updating the visual display could be helpful.

This is a made-up example, but it explains why an FPGA would be useful: otherwise I would need many different DSPs feeding their results into a computer for the visual display, whereas an FPGA can do it faster, since it is hardware, and with lower power needs.
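
To be concrete about the parallelism, here is a rough MyHDL-style sketch of what I mean (the sensor signals and the arithmetic are purely hypothetical, not a real design): each of the two processes below becomes its own piece of hardware, so every channel updates on the same clock edge instead of being time-sliced the way a CPU or DSP would do it.

    from myhdl import Signal, intbv, always

    def floor_node(clk, temp_in, wind_in, temp_avg, wind_peak):
        """Hypothetical per-floor block: two independent clocked processes
        that synthesis would turn into two parallel pieces of hardware.
        Assumes temp_in, wind_in, temp_avg, wind_peak are 16-bit Signals."""

        temp_acc = Signal(intbv(0)[20:])  # internal accumulator

        @always(clk.posedge)
        def average_temp():
            # crude exponential running average of the temperature reading
            temp_acc.next = temp_acc - (temp_acc >> 4) + temp_in
            temp_avg.next = temp_acc >> 4

        @always(clk.posedge)
        def track_wind_peak():
            # peak-hold on the wind-speed reading, checked every clock
            if wind_in > wind_peak:
                wind_peak.next = wind_in

        return average_temp, track_wind_peak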

There are two open-source libraries that could help make this easier to develop, but I am not certain which would be the better platform for converting a program to VHDL/Verilog.

This is just one example. If I want to do a quantum circuit simulation on an FPGA, as this article suggests (http://www.cc.gatech.edu/computing/nano/documents/Radecka%20-%20FPGA%20Emulation%20of%20Quantum%20Circuits.pdf), then it would be easier to do this as a program than to build up a large circuit by hand.

Initiate answered 14/10, 2012 at 1:58 Comment(7)
Can you explain in more detail why you need an FPGA? What sort of computations do you need? This sounds like it could be solved with a standard data acquisition system.Onceover
Might want to narrow the scope of your question.Bagehot
@Bagehot - There are at least two different platforms that can do what I would like, but I don't know which is better, basically.Initiate
"Doing many things at once" requiring an FPGA or many DSPs is likely when the data rate is in millions of samples per second. Your suggestion, weather, probably doesn't need to be sampled more than a few hundred times per second; you could probably churn through hundreds of sensors' worth of data (100 * 100 is 10,000 <<< 1,000,000) on a less expensive MCU (of the ARM variety), which could also be programmed in Python directly and is even easier to reprogram than many FPGAs.Dibri
I am interested in the question as it stands, given that I'd like to perform dozens of 3D convolutions on volumetric data coming in at ~200 MB/s with latencies of under 10 ms (preferably a fraction of a millisecond; whatever the limit, it's a hard limit).Neonate
@Rex Kerr Page 118 of this thesis may be relevant.Onceover
@Adam12 - As my robotics moves beyond what I want to do with a Parallax processor, for example converting real-time image acquisition into 3D vector maps, FPGAs may make life easier. I am just trying to see which platform makes more sense, rather than writing something in both and then testing, though it appears that may be the best option.Initiate

If you can afford it, I don't think anything will make your life easier than National Instruments' FPGA add-on for LabVIEW. The visual environment of LabVIEW is a reasonable fit for FPGA programming, and it takes care of many of the annoying details for you (unless you must worry about them as part of the algorithm, e.g. by building pipelines to hit your clock speed targets). Also, you may find that NI's real-time (non-FPGA), DSP, DAQ, or other solutions are adequate for your needs.

Neonate answered 14/10, 2012 at 16:23 Comment(3)
I thought about LabView, but I would prefer to do it with Python or Scala, or some similar language, if possible. But then, I haven't used LabView in 15 years so I expect it has changed a great deal since I used it.Initiate
@JamesBlack - It's changed only moderately in 15 years. But were you using it for FPGA programming before? Don't confuse "I had to use LabView to draw circuits that were supposed to emulate some lines of code accessing a bunch of data structures in interesting ways to do interesting things", which was and continues to be exceedingly painful, with "I will use LabView to draw circuits that are supposed to represent something that emulates circuits." Using a tool for that which it is well-suited is a much different experience than using it for something it is not.Neonate
I used it to read from equipment such as digital oscilloscopes, to send data over a socket.Initiate

Yes, there is a Python-style HDL available and it's free: MyHDL.

This will generate VHDL or Verilog. It can also simulate the code and dump a .vcd waveform file that you can view in GTKWave.
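
A rough sketch of that flow using the classic MyHDL API (the counter and its signals are just placeholders, and newer MyHDL versions use a @block decorator with .convert() instead): traceSignals makes the simulation dump a .vcd file you can open in GTKWave, and toVHDL/toVerilog write out the HDL.

    from myhdl import (Signal, intbv, always, instance, delay,
                       Simulation, traceSignals, toVHDL, toVerilog)

    def counter(clk, count):
        """Free-running 8-bit counter, standing in for a real design."""
        @always(clk.posedge)
        def inc():
            count.next = (count + 1) % 256
        return inc

    def bench():
        clk = Signal(bool(0))
        count = Signal(intbv(0)[8:])
        dut = counter(clk, count)

        @instance
        def drive_clock():
            while True:
                yield delay(10)
                clk.next = not clk

        return dut, drive_clock

    # simulate and dump a waveform (bench.vcd) to inspect in GTKWave
    Simulation(traceSignals(bench)).run(1000)

    # generate HDL from the same Python description
    clk = Signal(bool(0))
    count = Signal(intbv(0)[8:])
    toVHDL(counter, clk, count)     # counter.vhd
    toVerilog(counter, clk, count)  # counter.v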

Alternatively, if you want to work further with the generated VHDL code, you can use GHDL to analyze and simulate it; Google it and you will find lots of resources. There is also an OS spin available, Fedora Electronics Lab, which bundles the tools needed to develop modern electronics.

All of these are open source; build and simulate using these tools. To get the design onto the FPGA you need either the Xilinx or the Altera tool chain to generate the bitstream and program the device. All the best!

Elaterium answered 30/10, 2013 at 12:20 Comment(5)
the fedora electronic lab looks awesome! fedoraproject.org/wiki/Electronic_Lab?rd=ElectronicLab_SpinPound
@Pound you are commenting on a 4 year old post. And the link you give shows "last updated ... 2009". That's 8 years ago. Living in the past?Nisen
@JHBonarius, Sorry I did not pay much attention to that. Thanks for catching this. But the project looks more advanced than gEDA, and included everything that I might think about. Is the project as a whole dead? Are the packages or at least some of the packages still alive? If some of the packages are alive, then the link might be of some value?Pound
By the way, I recently used myhdl, and know they are adding new features to it (e.g. https://mcmap.net/q/1924080/-python-myhdl-package-how-to-generate-verilog-initial-block/362754). So the topic isn't really stale. :)Pound
I almost visited Eindhoven a while ago.Pound

This is a made-up example, but it explains why an FPGA would be useful: otherwise I would need many different DSPs feeding their results into a computer for the visual display, whereas an FPGA can do it faster, since it is hardware, and with lower power needs.

This depends entirely on the exact nature of the algorithms you need to execute.

There are two open-source libraries that could help make this easier to develop, but I am not certain which would be the better platform for converting a program to VHDL/Verilog.

This is just one example. If I want to do a quantum circuit simulation on an FPGA, as this article suggests (http://www.cc.gatech.edu/computing/nano/documents/Radecka%20-%20FPGA%20Emulation%20of%20Quantum%20Circuits.pdf), then it would be easier to do this as a program than to build up a large circuit by hand.

It looks like you're looking for a high-level synthesis (HLS) tool, which neither of those is. For generating RTL from algorithms, code generators can definitely help, but you'll still have to get your hands dirty with HDLs for other things.

Onceover answered 14/10, 2012 at 17:41 Comment(1)
I don't mind doing part of it, if the major part can be more automated.Initiate

I think you are looking for the kind of tooling that Modaë Technologies offers here. You can start with either Ruby or Python code at the algorithmic/behavioral level. Their tools can infer data types automatically and convert the code to RTL-level HDL (currently VHDL).

Precipitous answered 29/9, 2014 at 18:25 Comment(0)

One or two years ago I worked with MyHDL and LabVIEW. I wrote HDL in MyHDL, exported it as VHDL, and imported it as external IP in LabVIEW.

LabVIEW

It's nice for FPGA development because its graphical representation lets you keep track of the clock cycles consumed by each sequential branch. Pipelining and slicing your algorithm graphically is worth a lot for making sure that the correct values are processed together. However, when it comes to generating constants, initializer lists, or recursive structures, this visual approach to describing the actual hardware is sub-optimal.

MyHDL

It's basically Python, but the syntax looks a lot like Verilog thanks to decorators such as @always. You're free to use any valid Python in your HDL code. For testing purposes you can write a test function in plain Python before actually implementing the design at register-transfer level (RTL). When generating recursive structures you have the usual for statement. Need a look-up table (LUT) for your algorithm? One line of list comprehension: done. For a full list of features, see the website.
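
As a small illustration of the LUT point, here is a sketch of the ROM-from-tuple pattern (the table contents and names are invented): a tuple filled by a one-line comprehension, read out through tuple indexing that the converter can turn into a ROM.

    from math import pi, sin
    from myhdl import Signal, always_comb, intbv

    def rom(dout, addr, CONTENT):
        """Asynchronous ROM read: dout follows the table entry at addr."""
        @always_comb
        def read():
            dout.next = CONTENT[int(addr)]
        return read

    # the look-up table, built with one line of list comprehension
    SINE_LUT = tuple(int(round(127 + 127 * sin(2 * pi * i / 256))) for i in range(256))

    addr = Signal(intbv(0)[8:])
    dout = Signal(intbv(0)[8:])
    rom_inst = rom(dout, addr, SINE_LUT)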

Summary

LabVIEW is great for getting started, because you can focus on the actual implementation. As soon as you have mastered thinking in parallel and at the RTL level and want to implement more complex algorithms, you may find MyHDL better at:

  • managing your IP without being bound to a proprietary platform
  • testing your code with the full power of Python
  • sharing, versioning, etc.
Conversational answered 7/3, 2018 at 10:9 Comment(0)
